00:00:00.001 Started by upstream project "autotest-per-patch" build number 124217 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.063 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.064 The recommended git tool is: git 00:00:00.064 using credential 00000000-0000-0000-0000-000000000002 00:00:00.066 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.093 Fetching changes from the remote Git repository 00:00:00.095 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.139 Using shallow fetch with depth 1 00:00:00.139 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.139 > git --version # timeout=10 00:00:00.177 > git --version # 'git version 2.39.2' 00:00:00.177 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.215 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.215 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:04.954 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:04.966 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:04.977 Checking out Revision 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 (FETCH_HEAD) 00:00:04.977 > git config core.sparsecheckout # timeout=10 00:00:04.988 > git read-tree -mu HEAD # timeout=10 00:00:05.003 > git checkout -f 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=5 00:00:05.020 Commit message: "pool: fixes for VisualBuild class" 00:00:05.020 > git rev-list --no-walk 
9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=10 00:00:05.096 [Pipeline] Start of Pipeline 00:00:05.106 [Pipeline] library 00:00:05.107 Loading library shm_lib@master 00:00:05.107 Library shm_lib@master is cached. Copying from home. 00:00:05.126 [Pipeline] node 00:00:05.135 Running on CYP8 in /var/jenkins/workspace/crypto-phy-autotest 00:00:05.137 [Pipeline] { 00:00:05.148 [Pipeline] catchError 00:00:05.151 [Pipeline] { 00:00:05.163 [Pipeline] wrap 00:00:05.171 [Pipeline] { 00:00:05.177 [Pipeline] stage 00:00:05.178 [Pipeline] { (Prologue) 00:00:05.354 [Pipeline] sh 00:00:05.641 + logger -p user.info -t JENKINS-CI 00:00:05.658 [Pipeline] echo 00:00:05.659 Node: CYP8 00:00:05.666 [Pipeline] sh 00:00:05.966 [Pipeline] setCustomBuildProperty 00:00:05.979 [Pipeline] echo 00:00:05.980 Cleanup processes 00:00:05.986 [Pipeline] sh 00:00:06.275 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.275 1297909 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.289 [Pipeline] sh 00:00:06.615 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.616 ++ grep -v 'sudo pgrep' 00:00:06.616 ++ awk '{print $1}' 00:00:06.616 + sudo kill -9 00:00:06.616 + true 00:00:06.629 [Pipeline] cleanWs 00:00:06.637 [WS-CLEANUP] Deleting project workspace... 00:00:06.637 [WS-CLEANUP] Deferred wipeout is used... 
00:00:06.644 [WS-CLEANUP] done 00:00:06.648 [Pipeline] setCustomBuildProperty 00:00:06.662 [Pipeline] sh 00:00:06.990 + sudo git config --global --replace-all safe.directory '*' 00:00:07.056 [Pipeline] nodesByLabel 00:00:07.058 Found a total of 2 nodes with the 'sorcerer' label 00:00:07.070 [Pipeline] httpRequest 00:00:07.075 HttpMethod: GET 00:00:07.076 URL: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:00:07.082 Sending request to url: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:00:07.089 Response Code: HTTP/1.1 200 OK 00:00:07.089 Success: Status code 200 is in the accepted range: 200,404 00:00:07.090 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:00:07.645 [Pipeline] sh 00:00:07.932 + tar --no-same-owner -xf jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz 00:00:07.951 [Pipeline] httpRequest 00:00:07.956 HttpMethod: GET 00:00:07.957 URL: http://10.211.164.101/packages/spdk_c5b9f923d1f02be5c638708ffd4f439a17fc435d.tar.gz 00:00:07.957 Sending request to url: http://10.211.164.101/packages/spdk_c5b9f923d1f02be5c638708ffd4f439a17fc435d.tar.gz 00:00:07.961 Response Code: HTTP/1.1 200 OK 00:00:07.962 Success: Status code 200 is in the accepted range: 200,404 00:00:07.962 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_c5b9f923d1f02be5c638708ffd4f439a17fc435d.tar.gz 00:00:18.480 [Pipeline] sh 00:00:18.768 + tar --no-same-owner -xf spdk_c5b9f923d1f02be5c638708ffd4f439a17fc435d.tar.gz 00:00:22.082 [Pipeline] sh 00:00:22.370 + git -C spdk log --oneline -n5 00:00:22.370 c5b9f923d test/nvmf: run IO during TLS with kernel 00:00:22.370 25b1d44ec test: add a test for SPDK vs kernel TLS 00:00:22.370 7fc2ab43c scripts: add a keyctl session wrapper 00:00:22.370 00058f4d0 test/nvmf/common: do not use subnqn as model 00:00:22.370 fa40728d6 test/common: continue waitforserial on grep error 
00:00:22.383 [Pipeline] } 00:00:22.403 [Pipeline] // stage 00:00:22.413 [Pipeline] stage 00:00:22.416 [Pipeline] { (Prepare) 00:00:22.435 [Pipeline] writeFile 00:00:22.453 [Pipeline] sh 00:00:22.743 + logger -p user.info -t JENKINS-CI 00:00:22.757 [Pipeline] sh 00:00:23.046 + logger -p user.info -t JENKINS-CI 00:00:23.062 [Pipeline] sh 00:00:23.350 + cat autorun-spdk.conf 00:00:23.350 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:23.350 SPDK_TEST_BLOCKDEV=1 00:00:23.350 SPDK_TEST_ISAL=1 00:00:23.350 SPDK_TEST_CRYPTO=1 00:00:23.350 SPDK_TEST_REDUCE=1 00:00:23.350 SPDK_TEST_VBDEV_COMPRESS=1 00:00:23.350 SPDK_RUN_UBSAN=1 00:00:23.358 RUN_NIGHTLY=0 00:00:23.364 [Pipeline] readFile 00:00:23.392 [Pipeline] withEnv 00:00:23.394 [Pipeline] { 00:00:23.409 [Pipeline] sh 00:00:23.697 + set -ex 00:00:23.697 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:23.697 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:23.697 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:23.697 ++ SPDK_TEST_BLOCKDEV=1 00:00:23.697 ++ SPDK_TEST_ISAL=1 00:00:23.697 ++ SPDK_TEST_CRYPTO=1 00:00:23.697 ++ SPDK_TEST_REDUCE=1 00:00:23.697 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:23.697 ++ SPDK_RUN_UBSAN=1 00:00:23.697 ++ RUN_NIGHTLY=0 00:00:23.697 + case $SPDK_TEST_NVMF_NICS in 00:00:23.697 + DRIVERS= 00:00:23.697 + [[ -n '' ]] 00:00:23.697 + exit 0 00:00:23.709 [Pipeline] } 00:00:23.727 [Pipeline] // withEnv 00:00:23.733 [Pipeline] } 00:00:23.745 [Pipeline] // stage 00:00:23.754 [Pipeline] catchError 00:00:23.755 [Pipeline] { 00:00:23.768 [Pipeline] timeout 00:00:23.768 Timeout set to expire in 40 min 00:00:23.770 [Pipeline] { 00:00:23.784 [Pipeline] stage 00:00:23.786 [Pipeline] { (Tests) 00:00:23.801 [Pipeline] sh 00:00:24.087 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:24.087 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:24.087 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:24.087 + [[ -n 
/var/jenkins/workspace/crypto-phy-autotest ]] 00:00:24.087 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:24.087 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:24.087 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:24.087 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:24.087 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:24.087 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:24.087 + [[ crypto-phy-autotest == pkgdep-* ]] 00:00:24.087 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:24.087 + source /etc/os-release 00:00:24.087 ++ NAME='Fedora Linux' 00:00:24.087 ++ VERSION='38 (Cloud Edition)' 00:00:24.087 ++ ID=fedora 00:00:24.087 ++ VERSION_ID=38 00:00:24.087 ++ VERSION_CODENAME= 00:00:24.087 ++ PLATFORM_ID=platform:f38 00:00:24.087 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:24.087 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:24.087 ++ LOGO=fedora-logo-icon 00:00:24.087 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:24.087 ++ HOME_URL=https://fedoraproject.org/ 00:00:24.087 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:24.087 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:24.087 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:24.087 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:24.087 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:24.087 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:24.087 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:24.087 ++ SUPPORT_END=2024-05-14 00:00:24.087 ++ VARIANT='Cloud Edition' 00:00:24.087 ++ VARIANT_ID=cloud 00:00:24.087 + uname -a 00:00:24.087 Linux spdk-cyp-08 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:24.087 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:27.387 Hugepages 00:00:27.387 node hugesize free / total 00:00:27.387 node0 1048576kB 0 / 0 00:00:27.387 node0 
2048kB 0 / 0 00:00:27.387 node1 1048576kB 0 / 0 00:00:27.387 node1 2048kB 0 / 0 00:00:27.387 00:00:27.387 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:27.387 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:00:27.387 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:00:27.387 NVMe 0000:65:00.0 144d a80a 0 nvme nvme0 nvme0n1 00:00:27.387 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:00:27.387 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:00:27.387 + rm -f /tmp/spdk-ld-path 00:00:27.387 + source autorun-spdk.conf 00:00:27.387 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:27.387 ++ SPDK_TEST_BLOCKDEV=1 00:00:27.387 ++ SPDK_TEST_ISAL=1 00:00:27.387 ++ SPDK_TEST_CRYPTO=1 00:00:27.387 ++ SPDK_TEST_REDUCE=1 00:00:27.387 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:27.387 ++ SPDK_RUN_UBSAN=1 00:00:27.387 ++ RUN_NIGHTLY=0 00:00:27.387 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:27.387 + [[ -n '' ]] 00:00:27.387 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:27.387 + for M in /var/spdk/build-*-manifest.txt 00:00:27.387 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:27.387 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:27.387 + for M in /var/spdk/build-*-manifest.txt 00:00:27.387 + [[ -f 
/var/spdk/build-repo-manifest.txt ]] 00:00:27.387 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:27.387 ++ uname 00:00:27.387 + [[ Linux == \L\i\n\u\x ]] 00:00:27.387 + sudo dmesg -T 00:00:27.649 + sudo dmesg --clear 00:00:27.649 + dmesg_pid=1298994 00:00:27.649 + [[ Fedora Linux == FreeBSD ]] 00:00:27.649 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:27.649 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:27.649 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:27.649 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:27.649 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:27.649 + [[ -x /usr/src/fio-static/fio ]] 00:00:27.649 + export FIO_BIN=/usr/src/fio-static/fio 00:00:27.649 + FIO_BIN=/usr/src/fio-static/fio 00:00:27.649 + sudo dmesg -Tw 00:00:27.649 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:27.649 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:27.649 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:27.649 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:27.649 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:27.649 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:27.649 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:27.649 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:27.649 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:27.649 Test configuration: 00:00:27.649 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:27.649 SPDK_TEST_BLOCKDEV=1 00:00:27.649 SPDK_TEST_ISAL=1 00:00:27.649 SPDK_TEST_CRYPTO=1 00:00:27.649 SPDK_TEST_REDUCE=1 00:00:27.649 SPDK_TEST_VBDEV_COMPRESS=1 00:00:27.649 SPDK_RUN_UBSAN=1 00:00:27.649 RUN_NIGHTLY=0 13:28:42 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:27.649 13:28:42 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:27.649 13:28:42 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:27.649 13:28:42 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:27.649 13:28:42 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.649 13:28:42 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.649 13:28:42 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.649 13:28:42 -- paths/export.sh@5 -- $ export PATH 00:00:27.649 13:28:42 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.649 13:28:42 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:27.649 13:28:42 -- common/autobuild_common.sh@437 -- $ date +%s 00:00:27.649 13:28:42 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718018922.XXXXXX 00:00:27.649 13:28:42 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718018922.pUmRuu 00:00:27.649 13:28:42 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:00:27.649 13:28:42 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:00:27.649 13:28:42 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:27.649 
13:28:42 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:27.649 13:28:42 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:27.649 13:28:42 -- common/autobuild_common.sh@453 -- $ get_config_params 00:00:27.649 13:28:42 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:00:27.649 13:28:42 -- common/autotest_common.sh@10 -- $ set +x 00:00:27.649 13:28:42 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:27.649 13:28:42 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:00:27.649 13:28:42 -- pm/common@17 -- $ local monitor 00:00:27.649 13:28:42 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:27.649 13:28:42 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:27.649 13:28:42 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:27.649 13:28:42 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:27.649 13:28:42 -- pm/common@21 -- $ date +%s 00:00:27.649 13:28:42 -- pm/common@25 -- $ sleep 1 00:00:27.649 13:28:42 -- pm/common@21 -- $ date +%s 00:00:27.649 13:28:42 -- pm/common@21 -- $ date +%s 00:00:27.649 13:28:42 -- pm/common@21 -- $ date +%s 00:00:27.649 13:28:42 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718018922 00:00:27.649 13:28:42 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718018922 00:00:27.649 13:28:42 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718018922 00:00:27.649 13:28:42 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718018922 00:00:27.911 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718018922_collect-vmstat.pm.log 00:00:27.911 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718018922_collect-cpu-load.pm.log 00:00:27.911 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718018922_collect-cpu-temp.pm.log 00:00:27.911 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718018922_collect-bmc-pm.bmc.pm.log 00:00:28.852 13:28:43 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:00:28.852 13:28:43 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:28.852 13:28:43 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:28.852 13:28:43 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:28.852 13:28:43 -- spdk/autobuild.sh@16 -- $ date -u 00:00:28.852 Mon Jun 10 11:28:43 AM UTC 2024 00:00:28.852 13:28:43 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:28.852 v24.09-pre-61-gc5b9f923d 00:00:28.852 13:28:43 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:28.852 13:28:43 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:28.852 13:28:43 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using 
ubsan' 00:00:28.852 13:28:43 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']' 00:00:28.852 13:28:43 -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:00:28.852 13:28:43 -- common/autotest_common.sh@10 -- $ set +x 00:00:28.852 ************************************ 00:00:28.852 START TEST ubsan 00:00:28.852 ************************************ 00:00:28.852 13:28:43 ubsan -- common/autotest_common.sh@1124 -- $ echo 'using ubsan' 00:00:28.852 using ubsan 00:00:28.852 00:00:28.852 real 0m0.001s 00:00:28.852 user 0m0.000s 00:00:28.852 sys 0m0.001s 00:00:28.852 13:28:43 ubsan -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:00:28.852 13:28:43 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:28.852 ************************************ 00:00:28.852 END TEST ubsan 00:00:28.852 ************************************ 00:00:28.852 13:28:43 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:28.852 13:28:43 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:28.852 13:28:43 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:28.852 13:28:43 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:28.852 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:28.852 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:00:29.423 Using 'verbs' RDMA provider 00:00:45.711 Configuring ISA-L 
(logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:00:57.942 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:00:57.942 Creating mk/config.mk...done. 00:00:57.942 Creating mk/cc.flags.mk...done. 00:00:57.942 Type 'make' to build. 00:00:57.942 13:29:11 -- spdk/autobuild.sh@69 -- $ run_test make make -j144 00:00:57.942 13:29:11 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']' 00:00:57.942 13:29:11 -- common/autotest_common.sh@1106 -- $ xtrace_disable 00:00:57.942 13:29:11 -- common/autotest_common.sh@10 -- $ set +x 00:00:57.942 ************************************ 00:00:57.942 START TEST make 00:00:57.942 ************************************ 00:00:57.942 13:29:11 make -- common/autotest_common.sh@1124 -- $ make -j144 00:00:57.942 make[1]: Nothing to be done for 'all'. 00:01:30.037 The Meson build system 00:01:30.037 Version: 1.3.1 00:01:30.037 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:01:30.037 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:01:30.037 Build type: native build 00:01:30.037 Program cat found: YES (/usr/bin/cat) 00:01:30.037 Project name: DPDK 00:01:30.037 Project version: 24.03.0 00:01:30.037 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:30.037 C linker for the host machine: cc ld.bfd 2.39-16 00:01:30.037 Host machine cpu family: x86_64 00:01:30.037 Host machine cpu: x86_64 00:01:30.037 Message: ## Building in Developer Mode ## 00:01:30.037 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:30.037 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:30.037 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:30.037 Program python3 found: YES (/usr/bin/python3) 00:01:30.037 
Program cat found: YES (/usr/bin/cat) 00:01:30.037 Compiler for C supports arguments -march=native: YES 00:01:30.037 Checking for size of "void *" : 8 00:01:30.037 Checking for size of "void *" : 8 (cached) 00:01:30.037 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:01:30.037 Library m found: YES 00:01:30.037 Library numa found: YES 00:01:30.037 Has header "numaif.h" : YES 00:01:30.037 Library fdt found: NO 00:01:30.037 Library execinfo found: NO 00:01:30.037 Has header "execinfo.h" : YES 00:01:30.037 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:30.037 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:30.037 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:30.037 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:30.037 Run-time dependency openssl found: YES 3.0.9 00:01:30.037 Run-time dependency libpcap found: YES 1.10.4 00:01:30.037 Has header "pcap.h" with dependency libpcap: YES 00:01:30.037 Compiler for C supports arguments -Wcast-qual: YES 00:01:30.037 Compiler for C supports arguments -Wdeprecated: YES 00:01:30.037 Compiler for C supports arguments -Wformat: YES 00:01:30.037 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:30.037 Compiler for C supports arguments -Wformat-security: NO 00:01:30.037 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:30.037 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:30.037 Compiler for C supports arguments -Wnested-externs: YES 00:01:30.037 Compiler for C supports arguments -Wold-style-definition: YES 00:01:30.037 Compiler for C supports arguments -Wpointer-arith: YES 00:01:30.037 Compiler for C supports arguments -Wsign-compare: YES 00:01:30.037 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:30.037 Compiler for C supports arguments -Wundef: YES 00:01:30.037 Compiler for C supports arguments -Wwrite-strings: YES 00:01:30.037 Compiler for C supports arguments 
-Wno-address-of-packed-member: YES 00:01:30.037 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:30.037 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:30.037 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:30.037 Program objdump found: YES (/usr/bin/objdump) 00:01:30.037 Compiler for C supports arguments -mavx512f: YES 00:01:30.037 Checking if "AVX512 checking" compiles: YES 00:01:30.037 Fetching value of define "__SSE4_2__" : 1 00:01:30.037 Fetching value of define "__AES__" : 1 00:01:30.037 Fetching value of define "__AVX__" : 1 00:01:30.037 Fetching value of define "__AVX2__" : 1 00:01:30.037 Fetching value of define "__AVX512BW__" : 1 00:01:30.037 Fetching value of define "__AVX512CD__" : 1 00:01:30.037 Fetching value of define "__AVX512DQ__" : 1 00:01:30.037 Fetching value of define "__AVX512F__" : 1 00:01:30.037 Fetching value of define "__AVX512VL__" : 1 00:01:30.037 Fetching value of define "__PCLMUL__" : 1 00:01:30.037 Fetching value of define "__RDRND__" : 1 00:01:30.037 Fetching value of define "__RDSEED__" : 1 00:01:30.037 Fetching value of define "__VPCLMULQDQ__" : 1 00:01:30.037 Fetching value of define "__znver1__" : (undefined) 00:01:30.037 Fetching value of define "__znver2__" : (undefined) 00:01:30.037 Fetching value of define "__znver3__" : (undefined) 00:01:30.037 Fetching value of define "__znver4__" : (undefined) 00:01:30.037 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:30.037 Message: lib/log: Defining dependency "log" 00:01:30.037 Message: lib/kvargs: Defining dependency "kvargs" 00:01:30.037 Message: lib/telemetry: Defining dependency "telemetry" 00:01:30.037 Checking for function "getentropy" : NO 00:01:30.037 Message: lib/eal: Defining dependency "eal" 00:01:30.037 Message: lib/ring: Defining dependency "ring" 00:01:30.037 Message: lib/rcu: Defining dependency "rcu" 00:01:30.037 Message: lib/mempool: Defining dependency "mempool" 00:01:30.037 
Message: lib/mbuf: Defining dependency "mbuf" 00:01:30.037 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:30.037 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:30.037 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:30.037 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:30.037 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:30.037 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:01:30.037 Compiler for C supports arguments -mpclmul: YES 00:01:30.037 Compiler for C supports arguments -maes: YES 00:01:30.037 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:30.037 Compiler for C supports arguments -mavx512bw: YES 00:01:30.037 Compiler for C supports arguments -mavx512dq: YES 00:01:30.037 Compiler for C supports arguments -mavx512vl: YES 00:01:30.037 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:30.037 Compiler for C supports arguments -mavx2: YES 00:01:30.037 Compiler for C supports arguments -mavx: YES 00:01:30.037 Message: lib/net: Defining dependency "net" 00:01:30.037 Message: lib/meter: Defining dependency "meter" 00:01:30.037 Message: lib/ethdev: Defining dependency "ethdev" 00:01:30.037 Message: lib/pci: Defining dependency "pci" 00:01:30.037 Message: lib/cmdline: Defining dependency "cmdline" 00:01:30.037 Message: lib/hash: Defining dependency "hash" 00:01:30.037 Message: lib/timer: Defining dependency "timer" 00:01:30.037 Message: lib/compressdev: Defining dependency "compressdev" 00:01:30.037 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:30.037 Message: lib/dmadev: Defining dependency "dmadev" 00:01:30.037 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:30.037 Message: lib/power: Defining dependency "power" 00:01:30.037 Message: lib/reorder: Defining dependency "reorder" 00:01:30.037 Message: lib/security: Defining dependency "security" 00:01:30.037 Has header "linux/userfaultfd.h" : YES 00:01:30.037 Has header "linux/vduse.h" : YES 
00:01:30.037 Message: lib/vhost: Defining dependency "vhost"
00:01:30.037 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:30.037 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:01:30.037 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:30.037 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:30.037 Compiler for C supports arguments -std=c11: YES
00:01:30.037 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:01:30.037 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:01:30.037 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:01:30.037 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:01:30.037 Run-time dependency libmlx5 found: YES 1.24.44.0
00:01:30.037 Run-time dependency libibverbs found: YES 1.14.44.0
00:01:30.037 Library mtcr_ul found: NO
00:01:30.037 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:01:30.037 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:01:30.037 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:01:30.037 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:01:30.038 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:01:31.425 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:01:31.426 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO
00:01:31.426 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES
00:01:31.426 Configuring mlx5_autoconf.h using configuration
00:01:31.426 Message: drivers/common/mlx5: Defining dependency "common_mlx5"
00:01:31.426 Run-time dependency libcrypto found: YES 3.0.9
00:01:31.426 Library IPSec_MB found: YES
00:01:31.426 Fetching value of define "IMB_VERSION_STR" : "1.5.0"
00:01:31.426 Message: drivers/common/qat: Defining dependency "common_qat"
00:01:31.426 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:31.426 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:01:31.426 Library IPSec_MB found: YES
00:01:31.426 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached)
00:01:31.426 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb"
00:01:31.426 Compiler for C supports arguments -std=c11: YES (cached)
00:01:31.426 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:01:31.426 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5"
00:01:31.426 Run-time dependency libisal found: NO (tried pkgconfig)
00:01:31.426 Library libisal found: NO
00:01:31.426 Message: drivers/compress/isal: Defining dependency "compress_isal"
00:01:31.426 Compiler for C supports arguments -std=c11: YES (cached)
00:01:31.426 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:01:31.426 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:01:31.426 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5"
00:01:31.426 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:01:31.426 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:01:31.426 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:01:31.426 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:01:31.426 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:01:31.426 Program doxygen found: YES (/usr/bin/doxygen)
00:01:31.426 Configuring doxy-api-html.conf using configuration
00:01:31.426 Configuring doxy-api-man.conf using configuration
00:01:31.426 Program mandb found: YES (/usr/bin/mandb)
00:01:31.426 Program sphinx-build found: NO
00:01:31.426 Configuring rte_build_config.h using configuration
00:01:31.426 Message:
00:01:31.426 =================
00:01:31.426 Applications Enabled
00:01:31.426 =================
00:01:31.426
00:01:31.426 apps:
00:01:31.426
00:01:31.426
00:01:31.426 Message:
00:01:31.426 =================
00:01:31.426 Libraries Enabled
00:01:31.426 =================
00:01:31.426
00:01:31.426 libs:
00:01:31.426 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:31.426 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:01:31.426 cryptodev, dmadev, power, reorder, security, vhost,
00:01:31.426
00:01:31.426 Message:
00:01:31.426 ===============
00:01:31.426 Drivers Enabled
00:01:31.426 ===============
00:01:31.426
00:01:31.426 common:
00:01:31.426 mlx5, qat,
00:01:31.426 bus:
00:01:31.426 auxiliary, pci, vdev,
00:01:31.426 mempool:
00:01:31.426 ring,
00:01:31.426 dma:
00:01:31.426
00:01:31.426 net:
00:01:31.426
00:01:31.426 crypto:
00:01:31.426 ipsec_mb, mlx5,
00:01:31.426 compress:
00:01:31.426 isal, mlx5,
00:01:31.426 vdpa:
00:01:31.426
00:01:31.426
00:01:31.426 Message:
00:01:31.426 =================
00:01:31.426 Content Skipped
00:01:31.426 =================
00:01:31.426
00:01:31.426 apps:
00:01:31.426 dumpcap: explicitly disabled via build config
00:01:31.426 graph: explicitly disabled via build config
00:01:31.426 pdump: explicitly disabled via build config
00:01:31.426 proc-info: explicitly disabled via build config
00:01:31.426 test-acl: explicitly disabled via build config
00:01:31.426 test-bbdev: explicitly disabled via build config
00:01:31.426 test-cmdline: explicitly disabled via build config
00:01:31.426 test-compress-perf: explicitly disabled via build config
00:01:31.426 test-crypto-perf: explicitly disabled via build config
00:01:31.426 test-dma-perf: explicitly disabled via build config
00:01:31.426 test-eventdev: explicitly disabled via build config
00:01:31.426 test-fib: explicitly disabled via build config
00:01:31.426 test-flow-perf: explicitly disabled via build config
00:01:31.426 test-gpudev: explicitly disabled via build config
00:01:31.426 test-mldev: explicitly disabled via build config
00:01:31.426 test-pipeline: explicitly disabled via build config
00:01:31.426 test-pmd: explicitly disabled via build config
00:01:31.426 test-regex: explicitly disabled via build config
00:01:31.426 test-sad: explicitly disabled via build config
00:01:31.426 test-security-perf: explicitly disabled via build config
00:01:31.426
00:01:31.426 libs:
00:01:31.426 argparse: explicitly disabled via build config
00:01:31.426 metrics: explicitly disabled via build config
00:01:31.426 acl: explicitly disabled via build config
00:01:31.426 bbdev: explicitly disabled via build config
00:01:31.426 bitratestats: explicitly disabled via build config
00:01:31.426 bpf: explicitly disabled via build config
00:01:31.426 cfgfile: explicitly disabled via build config
00:01:31.426 distributor: explicitly disabled via build config
00:01:31.426 efd: explicitly disabled via build config
00:01:31.426 eventdev: explicitly disabled via build config
00:01:31.426 dispatcher: explicitly disabled via build config
00:01:31.426 gpudev: explicitly disabled via build config
00:01:31.426 gro: explicitly disabled via build config
00:01:31.426 gso: explicitly disabled via build config
00:01:31.426 ip_frag: explicitly disabled via build config
00:01:31.426 jobstats: explicitly disabled via build config
00:01:31.426 latencystats: explicitly disabled via build config
00:01:31.426 lpm: explicitly disabled via build config
00:01:31.426 member: explicitly disabled via build config
00:01:31.426 pcapng: explicitly disabled via build config
00:01:31.426 rawdev: explicitly disabled via build config
00:01:31.426 regexdev: explicitly disabled via build config
00:01:31.426 mldev: explicitly disabled via build config
00:01:31.426 rib: explicitly disabled via build config
00:01:31.426 sched: explicitly disabled via build config
00:01:31.426 stack: explicitly disabled via build config
00:01:31.426 ipsec: explicitly disabled via build config
00:01:31.426 pdcp: explicitly disabled via build config
00:01:31.426 fib: explicitly disabled via build config
00:01:31.426 port: explicitly disabled via build config
00:01:31.426 pdump: explicitly disabled via build config
00:01:31.426 table: explicitly disabled via build config
00:01:31.426 pipeline: explicitly disabled via build config
00:01:31.426 graph: explicitly disabled via build config
00:01:31.426 node: explicitly disabled via build config
00:01:31.426
00:01:31.426 drivers:
00:01:31.426 common/cpt: not in enabled drivers build config
00:01:31.426 common/dpaax: not in enabled drivers build config
00:01:31.427 common/iavf: not in enabled drivers build config
00:01:31.427 common/idpf: not in enabled drivers build config
00:01:31.427 common/ionic: not in enabled drivers build config
00:01:31.427 common/mvep: not in enabled drivers build config
00:01:31.427 common/octeontx: not in enabled drivers build config
00:01:31.427 bus/cdx: not in enabled drivers build config
00:01:31.427 bus/dpaa: not in enabled drivers build config
00:01:31.427 bus/fslmc: not in enabled drivers build config
00:01:31.427 bus/ifpga: not in enabled drivers build config
00:01:31.427 bus/platform: not in enabled drivers build config
00:01:31.427 bus/uacce: not in enabled drivers build config
00:01:31.427 bus/vmbus: not in enabled drivers build config
00:01:31.427 common/cnxk: not in enabled drivers build config
00:01:31.427 common/nfp: not in enabled drivers build config
00:01:31.427 common/nitrox: not in enabled drivers build config
00:01:31.427 common/sfc_efx: not in enabled drivers build config
00:01:31.427 mempool/bucket: not in enabled drivers build config
00:01:31.427 mempool/cnxk: not in enabled drivers build config
00:01:31.427 mempool/dpaa: not in enabled drivers build config
00:01:31.427 mempool/dpaa2: not in enabled drivers build config
00:01:31.427 mempool/octeontx: not in enabled drivers build config
00:01:31.427 mempool/stack: not in enabled drivers build config
00:01:31.427 dma/cnxk: not in enabled drivers build config
00:01:31.427 dma/dpaa: not in enabled drivers build config
00:01:31.427 dma/dpaa2: not in enabled drivers build config
00:01:31.427 dma/hisilicon: not in enabled drivers build config
00:01:31.427 dma/idxd: not in enabled drivers build config
00:01:31.427 dma/ioat: not in enabled drivers build config
00:01:31.427 dma/skeleton: not in enabled drivers build config
00:01:31.427 net/af_packet: not in enabled drivers build config
00:01:31.427 net/af_xdp: not in enabled drivers build config
00:01:31.427 net/ark: not in enabled drivers build config
00:01:31.427 net/atlantic: not in enabled drivers build config
00:01:31.427 net/avp: not in enabled drivers build config
00:01:31.427 net/axgbe: not in enabled drivers build config
00:01:31.427 net/bnx2x: not in enabled drivers build config
00:01:31.427 net/bnxt: not in enabled drivers build config
00:01:31.427 net/bonding: not in enabled drivers build config
00:01:31.427 net/cnxk: not in enabled drivers build config
00:01:31.427 net/cpfl: not in enabled drivers build config
00:01:31.427 net/cxgbe: not in enabled drivers build config
00:01:31.427 net/dpaa: not in enabled drivers build config
00:01:31.427 net/dpaa2: not in enabled drivers build config
00:01:31.427 net/e1000: not in enabled drivers build config
00:01:31.427 net/ena: not in enabled drivers build config
00:01:31.427 net/enetc: not in enabled drivers build config
00:01:31.427 net/enetfec: not in enabled drivers build config
00:01:31.427 net/enic: not in enabled drivers build config
00:01:31.427 net/failsafe: not in enabled drivers build config
00:01:31.427 net/fm10k: not in enabled drivers build config
00:01:31.427 net/gve: not in enabled drivers build config
00:01:31.427 net/hinic: not in enabled drivers build config
00:01:31.427 net/hns3: not in enabled drivers build config
00:01:31.427 net/i40e: not in enabled drivers build config
00:01:31.427 net/iavf: not in enabled drivers build config
00:01:31.427 net/ice: not in enabled drivers build config
00:01:31.427 net/idpf: not in enabled drivers build config
00:01:31.427 net/igc: not in enabled drivers build config
00:01:31.427 net/ionic: not in enabled drivers build config
00:01:31.427 net/ipn3ke: not in enabled drivers build config
00:01:31.427 net/ixgbe: not in enabled drivers build config
00:01:31.427 net/mana: not in enabled drivers build config
00:01:31.427 net/memif: not in enabled drivers build config
00:01:31.427 net/mlx4: not in enabled drivers build config
00:01:31.427 net/mlx5: not in enabled drivers build config
00:01:31.427 net/mvneta: not in enabled drivers build config
00:01:31.427 net/mvpp2: not in enabled drivers build config
00:01:31.427 net/netvsc: not in enabled drivers build config
00:01:31.427 net/nfb: not in enabled drivers build config
00:01:31.427 net/nfp: not in enabled drivers build config
00:01:31.427 net/ngbe: not in enabled drivers build config
00:01:31.427 net/null: not in enabled drivers build config
00:01:31.427 net/octeontx: not in enabled drivers build config
00:01:31.427 net/octeon_ep: not in enabled drivers build config
00:01:31.427 net/pcap: not in enabled drivers build config
00:01:31.427 net/pfe: not in enabled drivers build config
00:01:31.427 net/qede: not in enabled drivers build config
00:01:31.427 net/ring: not in enabled drivers build config
00:01:31.427 net/sfc: not in enabled drivers build config
00:01:31.427 net/softnic: not in enabled drivers build config
00:01:31.427 net/tap: not in enabled drivers build config
00:01:31.427 net/thunderx: not in enabled drivers build config
00:01:31.427 net/txgbe: not in enabled drivers build config
00:01:31.427 net/vdev_netvsc: not in enabled drivers build config
00:01:31.427 net/vhost: not in enabled drivers build config
00:01:31.427 net/virtio: not in enabled drivers build config
00:01:31.427 net/vmxnet3: not in enabled drivers build config
00:01:31.427 raw/*: missing internal dependency, "rawdev"
00:01:31.427 crypto/armv8: not in enabled drivers build config
00:01:31.427 crypto/bcmfs: not in enabled drivers build config
00:01:31.427 crypto/caam_jr: not in enabled drivers build config
00:01:31.427 crypto/ccp: not in enabled drivers build config
00:01:31.427 crypto/cnxk: not in enabled drivers build config
00:01:31.427 crypto/dpaa_sec: not in enabled drivers build config
00:01:31.427 crypto/dpaa2_sec: not in enabled drivers build config
00:01:31.427 crypto/mvsam: not in enabled drivers build config
00:01:31.427 crypto/nitrox: not in enabled drivers build config
00:01:31.427 crypto/null: not in enabled drivers build config
00:01:31.427 crypto/octeontx: not in enabled drivers build config
00:01:31.427 crypto/openssl: not in enabled drivers build config
00:01:31.427 crypto/scheduler: not in enabled drivers build config
00:01:31.427 crypto/uadk: not in enabled drivers build config
00:01:31.427 crypto/virtio: not in enabled drivers build config
00:01:31.427 compress/nitrox: not in enabled drivers build config
00:01:31.427 compress/octeontx: not in enabled drivers build config
00:01:31.427 compress/zlib: not in enabled drivers build config
00:01:31.427 regex/*: missing internal dependency, "regexdev"
00:01:31.427 ml/*: missing internal dependency, "mldev"
00:01:31.427 vdpa/ifc: not in enabled drivers build config
00:01:31.427 vdpa/mlx5: not in enabled drivers build config
00:01:31.427 vdpa/nfp: not in enabled drivers build config
00:01:31.427 vdpa/sfc: not in enabled drivers build config
00:01:31.427 event/*: missing internal dependency, "eventdev"
00:01:31.427 baseband/*: missing internal dependency, "bbdev"
00:01:31.427 gpu/*: missing internal dependency, "gpudev"
00:01:31.427
00:01:31.427
00:01:31.427 Build targets in project: 114
00:01:31.427
00:01:31.427 DPDK 24.03.0
00:01:31.427
00:01:31.427 User defined options
00:01:31.427 buildtype : debug
00:01:31.427 default_library : shared
00:01:31.427 libdir : lib
00:01:31.427 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:31.427 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror
00:01:31.427 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal
00:01:31.427 cpu_instruction_set: native
00:01:31.427 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib
00:01:31.427 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,argparse,pipeline,bbdev,table,metrics,member,jobstats,efd,rib
00:01:31.427 enable_docs : false
00:01:31.427 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5
00:01:31.427 enable_kmods : false
00:01:31.427 tests : false
00:01:31.427
00:01:31.427 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:32.014 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp'
00:01:32.014 [1/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:32.014 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:32.014 [3/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:32.014 [4/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:32.014 [5/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:32.014 [6/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:32.014 [7/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:32.310 [8/377] Linking static target lib/librte_kvargs.a
00:01:32.310 [9/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:32.310 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:32.310 [11/377] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:32.310 [12/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:32.310 [13/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:32.310 [14/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:32.310 [15/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:32.310 [16/377] Linking static target lib/librte_log.a
00:01:32.310 [17/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:32.310 [18/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:32.310 [19/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:32.310 [20/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:32.310 [21/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o
00:01:32.310 [22/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:32.310 [23/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:32.310 [24/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:32.310 [25/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:32.310 [26/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:32.310 [27/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:01:32.310 [28/377] Linking static target lib/librte_pci.a
00:01:32.310 [29/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:32.310 [30/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:32.310 [31/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:01:32.310 [32/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:32.310 [33/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:01:32.310 [34/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:32.634 [35/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:32.634 [36/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:01:32.634 [37/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:01:32.634 [38/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o
00:01:32.634 [39/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:01:32.634 [40/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:32.634 [41/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:32.634 [42/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:32.634 [43/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:32.634 [44/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:32.634 [45/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:32.634 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:32.634 [47/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:32.634 [48/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:32.634 [49/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:32.634 [50/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:32.924 [51/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:32.924 [52/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:32.924 [53/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:32.924 [54/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:32.924 [55/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:32.924 [56/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:32.924 [57/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:32.924 [58/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:32.924 [59/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:32.924 [60/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:32.924 [61/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.924 [62/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:32.924 [63/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:32.924 [64/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:32.924 [65/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o
00:01:32.924 [66/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:01:32.924 [67/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:32.924 [68/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:32.924 [69/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:32.924 [70/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:32.924 [71/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o
00:01:32.924 [72/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:32.924 [73/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:32.924 [74/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:32.924 [75/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:32.924 [76/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:32.924 [77/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:32.924 [78/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:32.924 [79/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:32.924 [80/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:32.924 [81/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:32.924 [82/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:32.924 [83/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:32.924 [84/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:32.924 [85/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:32.924 [86/377] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:32.924 [87/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:32.924 [88/377] Linking static target lib/librte_meter.a
00:01:32.924 [89/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:32.924 [90/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:32.924 [91/377] Linking static target lib/librte_ring.a
00:01:32.924 [92/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:01:32.924 [93/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:32.924 [94/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:32.924 [95/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o
00:01:32.924 [96/377] Linking static target lib/librte_telemetry.a
00:01:32.924 [97/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:32.924 [98/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:32.924 [99/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:01:32.924 [100/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:32.924 [101/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:32.924 [102/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:32.924 [103/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:32.924 [104/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:01:32.924 [105/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:01:32.924 [106/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:32.924 [107/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:32.924 [108/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:01:32.924 [109/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:32.924 [110/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:01:32.924 [111/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:01:32.924 [112/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:01:32.924 [113/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:32.924 [114/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:01:32.924 [115/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:32.924 [116/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:01:32.924 [117/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:01:32.924 [118/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:32.924 [119/377] Linking static target lib/librte_cmdline.a
00:01:32.924 [120/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:32.924 [121/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:32.924 [122/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:01:32.924 [123/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:01:32.924 [124/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:01:32.924 [125/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:01:32.924 [126/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:01:32.924 [127/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:01:32.924 [128/377] Linking static target lib/librte_timer.a
00:01:32.924 [129/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:01:32.924 [130/377] Linking static target lib/librte_dmadev.a
00:01:32.924 [131/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o
00:01:32.924 [132/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o
00:01:32.924 [133/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:01:32.924 [134/377] Linking static target drivers/libtmp_rte_bus_vdev.a
00:01:32.924 [135/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:01:32.924 [136/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:01:32.924 [137/377] Linking static target lib/librte_mempool.a
00:01:32.924 [138/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:01:32.924 [139/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o
00:01:32.924 [140/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o
00:01:32.924 [141/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:01:32.924 [142/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o
00:01:32.924 [143/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a
00:01:32.924 [144/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:01:32.924 [145/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:01:32.924 [146/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:33.183 [147/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:01:33.183 [148/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:01:33.183 [149/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:01:33.183 [150/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:01:33.183 [151/377] Linking static target lib/librte_compressdev.a
00:01:33.183 [152/377] Linking static target lib/librte_net.a
00:01:33.183 [153/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:01:33.183 [154/377] Linking static target lib/librte_rcu.a
00:01:33.183 [155/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:01:33.183 [156/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:01:33.183 [157/377] Linking static target lib/librte_eal.a
00:01:33.183 [158/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:01:33.183 [159/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:01:33.183 [160/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:01:33.183 [161/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:01:33.183 [162/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:01:33.183 [163/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o
00:01:33.183 [164/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o
00:01:33.184 [165/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o
00:01:33.184 [166/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o
00:01:33.184 [167/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:01:33.184 [168/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:01:33.184 [169/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:01:33.184 [170/377] Linking static target lib/librte_power.a
00:01:33.184 [171/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o
00:01:33.184 [172/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:01:33.184 [173/377] Linking static target drivers/libtmp_rte_bus_pci.a
00:01:33.184 [174/377] Linking static target lib/librte_reorder.a
00:01:33.184 [175/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.184 [176/377] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:01:33.184 [177/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:01:33.184 [178/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o
00:01:33.184 [179/377] Linking target lib/librte_log.so.24.1
00:01:33.184 [180/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:01:33.184 [181/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command
00:01:33.184 [182/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:01:33.184 [183/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:01:33.184 [184/377] Linking static target lib/librte_mbuf.a
00:01:33.184 [185/377] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:01:33.184 [186/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o
00:01:33.184 [187/377] Linking static target lib/librte_security.a
00:01:33.184 [188/377] Compiling C object
drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:33.184 [189/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:33.184 [190/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:33.184 [191/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:33.184 [192/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.184 [193/377] Linking static target lib/librte_hash.a 00:01:33.184 [194/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:33.184 [195/377] Linking static target drivers/librte_bus_vdev.a 00:01:33.184 [196/377] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:33.184 [197/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:33.184 [198/377] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:33.184 [199/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:33.184 [200/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:33.184 [201/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:33.444 [202/377] Linking static target drivers/librte_bus_auxiliary.a 00:01:33.444 [203/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:33.444 [204/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:33.444 [205/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:01:33.444 [206/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:33.444 [207/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:33.444 
[208/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:33.444 [209/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:33.444 [210/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:33.444 [211/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:33.444 [212/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:33.444 [213/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:01:33.444 [214/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:33.444 [215/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.444 [216/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:33.444 [217/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:01:33.444 [218/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:33.444 [219/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:33.444 [220/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:33.444 [221/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.444 [222/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:33.444 [223/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:33.444 [224/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:33.444 [225/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:33.444 [226/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 
00:01:33.444 [227/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:33.444 [228/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:01:33.444 [229/377] Linking target lib/librte_kvargs.so.24.1 00:01:33.444 [230/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:33.444 [231/377] Linking static target drivers/librte_bus_pci.a 00:01:33.444 [232/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:33.444 [233/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:33.444 [234/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:33.444 [235/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.444 [236/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.444 [237/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:01:33.444 [238/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:01:33.444 [239/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:33.444 [240/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:01:33.444 [241/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:33.444 [242/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:33.444 [243/377] Linking target lib/librte_telemetry.so.24.1 00:01:33.444 [244/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:33.444 [245/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:33.444 [246/377] Compiling C object 
drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:33.444 [247/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:33.444 [248/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:33.444 [249/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:33.444 [250/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:33.444 [251/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:33.444 [252/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:33.444 [253/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:33.444 [254/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:01:33.444 [255/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:33.444 [256/377] Linking static target lib/librte_cryptodev.a 00:01:33.444 [257/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:01:33.444 [258/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.705 [259/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.705 [260/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:33.705 [261/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:33.705 [262/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:33.705 [263/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.705 [264/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:01:33.705 [265/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command 
(wrapped by meson to capture output) 00:01:33.705 [266/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:33.705 [267/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:33.705 [268/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:01:33.705 [269/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:33.705 [270/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:33.705 [271/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:33.705 [272/377] Linking static target drivers/librte_mempool_ring.a 00:01:33.705 [273/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:33.705 [274/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:33.705 [275/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:01:33.705 [276/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:33.705 [277/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:33.705 [278/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:33.705 [279/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.705 [280/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:33.705 [281/377] Linking static target drivers/librte_compress_isal.a 00:01:33.705 [282/377] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:33.705 [283/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:33.705 [284/377] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:33.705 [285/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:33.705 [286/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:33.705 [287/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:33.705 [288/377] Linking static target drivers/librte_compress_mlx5.a 00:01:33.705 [289/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:33.705 [290/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:33.705 [291/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.964 [292/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:33.964 [293/377] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:01:33.964 [294/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.964 [295/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:33.964 [296/377] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:33.964 [297/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:33.964 [298/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:33.964 [299/377] Linking static target drivers/librte_crypto_mlx5.a 00:01:33.964 [300/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:01:33.964 [301/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:33.964 [302/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:33.964 [303/377] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:33.964 [304/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:33.964 [305/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:33.964 [306/377] Linking static target drivers/librte_common_mlx5.a 00:01:33.964 [307/377] Linking static target lib/librte_ethdev.a 00:01:33.964 [308/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:33.964 [309/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.224 [310/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:01:34.224 [311/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.224 [312/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:34.224 [313/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:34.224 [314/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:01:34.224 [315/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.224 [316/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.484 [317/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:01:34.745 [318/377] Linking static target drivers/libtmp_rte_common_qat.a 00:01:35.005 [319/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:01:35.005 [320/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:35.005 [321/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:35.005 [322/377] Linking static target drivers/librte_common_qat.a 00:01:35.005 [323/377] 
Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:35.005 [324/377] Linking static target lib/librte_vhost.a 00:01:35.578 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.487 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.030 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.239 [328/377] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.810 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.810 [330/377] Linking target lib/librte_eal.so.24.1 00:01:45.071 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:01:45.071 [332/377] Linking target lib/librte_ring.so.24.1 00:01:45.071 [333/377] Linking target lib/librte_timer.so.24.1 00:01:45.071 [334/377] Linking target lib/librte_meter.so.24.1 00:01:45.071 [335/377] Linking target lib/librte_pci.so.24.1 00:01:45.071 [336/377] Linking target lib/librte_dmadev.so.24.1 00:01:45.071 [337/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:01:45.071 [338/377] Linking target drivers/librte_bus_vdev.so.24.1 00:01:45.332 [339/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:01:45.332 [340/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:01:45.332 [341/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:01:45.332 [342/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:01:45.332 [343/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:01:45.332 [344/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:01:45.332 [345/377] Generating symbol file 
lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:01:45.332 [346/377] Linking target drivers/librte_bus_pci.so.24.1 00:01:45.332 [347/377] Linking target lib/librte_rcu.so.24.1 00:01:45.332 [348/377] Linking target lib/librte_mempool.so.24.1 00:01:45.593 [349/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:01:45.593 [350/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:01:45.593 [351/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:01:45.593 [352/377] Linking target drivers/librte_mempool_ring.so.24.1 00:01:45.593 [353/377] Linking target lib/librte_mbuf.so.24.1 00:01:45.593 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:01:45.855 [355/377] Linking target lib/librte_compressdev.so.24.1 00:01:45.855 [356/377] Linking target lib/librte_reorder.so.24.1 00:01:45.855 [357/377] Linking target lib/librte_net.so.24.1 00:01:45.855 [358/377] Linking target lib/librte_cryptodev.so.24.1 00:01:45.855 [359/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:01:45.855 [360/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:01:45.855 [361/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:01:45.855 [362/377] Linking target lib/librte_cmdline.so.24.1 00:01:45.855 [363/377] Linking target lib/librte_hash.so.24.1 00:01:46.117 [364/377] Linking target drivers/librte_compress_isal.so.24.1 00:01:46.117 [365/377] Linking target lib/librte_ethdev.so.24.1 00:01:46.117 [366/377] Linking target lib/librte_security.so.24.1 00:01:46.117 [367/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:01:46.117 [368/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:01:46.117 [369/377] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:01:46.117 [370/377] Linking target lib/librte_power.so.24.1 00:01:46.117 [371/377] Linking target drivers/librte_common_mlx5.so.24.1 00:01:46.117 [372/377] Linking target lib/librte_vhost.so.24.1 00:01:46.378 [373/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:01:46.378 [374/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:01:46.378 [375/377] Linking target drivers/librte_common_qat.so.24.1 00:01:46.378 [376/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:01:46.378 [377/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:01:46.378 INFO: autodetecting backend as ninja 00:01:46.378 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 144 00:01:47.765 CC lib/ut_mock/mock.o 00:01:47.765 CC lib/log/log.o 00:01:47.765 CC lib/log/log_flags.o 00:01:47.765 CC lib/log/log_deprecated.o 00:01:47.765 CC lib/ut/ut.o 00:01:47.765 LIB libspdk_log.a 00:01:47.765 LIB libspdk_ut_mock.a 00:01:48.026 LIB libspdk_ut.a 00:01:48.026 SO libspdk_log.so.7.0 00:01:48.026 SO libspdk_ut_mock.so.6.0 00:01:48.026 SO libspdk_ut.so.2.0 00:01:48.026 SYMLINK libspdk_ut_mock.so 00:01:48.026 SYMLINK libspdk_log.so 00:01:48.026 SYMLINK libspdk_ut.so 00:01:48.287 CC lib/util/base64.o 00:01:48.287 CC lib/util/cpuset.o 00:01:48.287 CC lib/util/bit_array.o 00:01:48.287 CC lib/util/crc16.o 00:01:48.287 CC lib/util/crc32.o 00:01:48.287 CC lib/util/crc32c.o 00:01:48.287 CC lib/util/crc32_ieee.o 00:01:48.287 CC lib/util/crc64.o 00:01:48.287 CC lib/ioat/ioat.o 00:01:48.287 CC lib/util/dif.o 00:01:48.287 CC lib/util/file.o 00:01:48.287 CC lib/util/fd.o 00:01:48.287 CC lib/util/hexlify.o 00:01:48.287 CC lib/util/iov.o 00:01:48.287 CC lib/util/math.o 00:01:48.287 CC lib/util/pipe.o 00:01:48.287 CC lib/util/strerror_tls.o 00:01:48.287 CXX lib/trace_parser/trace.o 00:01:48.287 CC 
lib/util/string.o 00:01:48.287 CC lib/util/uuid.o 00:01:48.287 CC lib/util/fd_group.o 00:01:48.287 CC lib/util/xor.o 00:01:48.287 CC lib/dma/dma.o 00:01:48.287 CC lib/util/zipf.o 00:01:48.548 CC lib/vfio_user/host/vfio_user_pci.o 00:01:48.548 CC lib/vfio_user/host/vfio_user.o 00:01:48.548 LIB libspdk_dma.a 00:01:48.548 SO libspdk_dma.so.4.0 00:01:48.548 LIB libspdk_ioat.a 00:01:48.548 SYMLINK libspdk_dma.so 00:01:48.548 SO libspdk_ioat.so.7.0 00:01:48.809 LIB libspdk_vfio_user.a 00:01:48.809 SYMLINK libspdk_ioat.so 00:01:48.809 SO libspdk_vfio_user.so.5.0 00:01:48.809 LIB libspdk_util.a 00:01:48.809 SYMLINK libspdk_vfio_user.so 00:01:48.809 SO libspdk_util.so.9.0 00:01:49.070 SYMLINK libspdk_util.so 00:01:49.070 LIB libspdk_trace_parser.a 00:01:49.070 SO libspdk_trace_parser.so.5.0 00:01:49.330 SYMLINK libspdk_trace_parser.so 00:01:49.330 CC lib/conf/conf.o 00:01:49.330 CC lib/json/json_parse.o 00:01:49.330 CC lib/json/json_util.o 00:01:49.330 CC lib/json/json_write.o 00:01:49.330 CC lib/reduce/reduce.o 00:01:49.330 CC lib/rdma/common.o 00:01:49.330 CC lib/rdma/rdma_verbs.o 00:01:49.330 CC lib/idxd/idxd.o 00:01:49.330 CC lib/idxd/idxd_user.o 00:01:49.330 CC lib/idxd/idxd_kernel.o 00:01:49.330 CC lib/env_dpdk/env.o 00:01:49.330 CC lib/env_dpdk/memory.o 00:01:49.330 CC lib/env_dpdk/pci.o 00:01:49.330 CC lib/vmd/vmd.o 00:01:49.330 CC lib/env_dpdk/init.o 00:01:49.330 CC lib/vmd/led.o 00:01:49.330 CC lib/env_dpdk/threads.o 00:01:49.330 CC lib/env_dpdk/pci_ioat.o 00:01:49.330 CC lib/env_dpdk/pci_virtio.o 00:01:49.330 CC lib/env_dpdk/pci_vmd.o 00:01:49.330 CC lib/env_dpdk/pci_idxd.o 00:01:49.330 CC lib/env_dpdk/pci_event.o 00:01:49.330 CC lib/env_dpdk/sigbus_handler.o 00:01:49.330 CC lib/env_dpdk/pci_dpdk.o 00:01:49.330 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:49.330 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:49.591 LIB libspdk_conf.a 00:01:49.591 SO libspdk_conf.so.6.0 00:01:49.591 LIB libspdk_rdma.a 00:01:49.591 LIB libspdk_json.a 00:01:49.852 SO libspdk_rdma.so.6.0 
00:01:49.853 SO libspdk_json.so.6.0 00:01:49.853 SYMLINK libspdk_conf.so 00:01:49.853 SYMLINK libspdk_rdma.so 00:01:49.853 SYMLINK libspdk_json.so 00:01:49.853 LIB libspdk_idxd.a 00:01:49.853 SO libspdk_idxd.so.12.0 00:01:50.115 LIB libspdk_vmd.a 00:01:50.115 LIB libspdk_reduce.a 00:01:50.115 SYMLINK libspdk_idxd.so 00:01:50.115 SO libspdk_vmd.so.6.0 00:01:50.115 SO libspdk_reduce.so.6.0 00:01:50.115 SYMLINK libspdk_vmd.so 00:01:50.115 SYMLINK libspdk_reduce.so 00:01:50.115 CC lib/jsonrpc/jsonrpc_server.o 00:01:50.115 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:50.115 CC lib/jsonrpc/jsonrpc_client.o 00:01:50.115 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:50.376 LIB libspdk_jsonrpc.a 00:01:50.376 SO libspdk_jsonrpc.so.6.0 00:01:50.638 SYMLINK libspdk_jsonrpc.so 00:01:50.638 LIB libspdk_env_dpdk.a 00:01:50.638 SO libspdk_env_dpdk.so.14.0 00:01:50.898 SYMLINK libspdk_env_dpdk.so 00:01:50.898 CC lib/rpc/rpc.o 00:01:51.159 LIB libspdk_rpc.a 00:01:51.159 SO libspdk_rpc.so.6.0 00:01:51.159 SYMLINK libspdk_rpc.so 00:01:51.420 CC lib/notify/notify.o 00:01:51.420 CC lib/notify/notify_rpc.o 00:01:51.683 CC lib/trace/trace.o 00:01:51.683 CC lib/trace/trace_flags.o 00:01:51.683 CC lib/trace/trace_rpc.o 00:01:51.683 CC lib/keyring/keyring.o 00:01:51.683 CC lib/keyring/keyring_rpc.o 00:01:51.683 LIB libspdk_notify.a 00:01:51.683 SO libspdk_notify.so.6.0 00:01:51.683 LIB libspdk_keyring.a 00:01:51.943 SYMLINK libspdk_notify.so 00:01:51.943 LIB libspdk_trace.a 00:01:51.943 SO libspdk_keyring.so.1.0 00:01:51.943 SO libspdk_trace.so.10.0 00:01:51.943 SYMLINK libspdk_keyring.so 00:01:51.943 SYMLINK libspdk_trace.so 00:01:52.204 CC lib/thread/thread.o 00:01:52.204 CC lib/thread/iobuf.o 00:01:52.204 CC lib/sock/sock.o 00:01:52.204 CC lib/sock/sock_rpc.o 00:01:52.776 LIB libspdk_sock.a 00:01:52.776 SO libspdk_sock.so.10.0 00:01:52.776 SYMLINK libspdk_sock.so 00:01:53.036 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:53.036 CC lib/nvme/nvme_ctrlr.o 00:01:53.036 CC lib/nvme/nvme_fabric.o 00:01:53.036 CC 
lib/nvme/nvme_ns_cmd.o 00:01:53.036 CC lib/nvme/nvme_ns.o 00:01:53.036 CC lib/nvme/nvme_pcie_common.o 00:01:53.036 CC lib/nvme/nvme_pcie.o 00:01:53.036 CC lib/nvme/nvme_qpair.o 00:01:53.036 CC lib/nvme/nvme.o 00:01:53.036 CC lib/nvme/nvme_quirks.o 00:01:53.036 CC lib/nvme/nvme_transport.o 00:01:53.036 CC lib/nvme/nvme_discovery.o 00:01:53.036 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:53.036 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:53.036 CC lib/nvme/nvme_tcp.o 00:01:53.037 CC lib/nvme/nvme_opal.o 00:01:53.037 CC lib/nvme/nvme_io_msg.o 00:01:53.037 CC lib/nvme/nvme_poll_group.o 00:01:53.037 CC lib/nvme/nvme_zns.o 00:01:53.037 CC lib/nvme/nvme_stubs.o 00:01:53.037 CC lib/nvme/nvme_auth.o 00:01:53.037 CC lib/nvme/nvme_cuse.o 00:01:53.037 CC lib/nvme/nvme_rdma.o 00:01:53.602 LIB libspdk_thread.a 00:01:53.602 SO libspdk_thread.so.10.0 00:01:53.602 SYMLINK libspdk_thread.so 00:01:53.863 CC lib/blob/blobstore.o 00:01:53.863 CC lib/blob/request.o 00:01:53.863 CC lib/blob/zeroes.o 00:01:53.863 CC lib/blob/blob_bs_dev.o 00:01:53.863 CC lib/virtio/virtio.o 00:01:53.863 CC lib/accel/accel.o 00:01:53.863 CC lib/virtio/virtio_vhost_user.o 00:01:53.863 CC lib/accel/accel_rpc.o 00:01:53.863 CC lib/virtio/virtio_vfio_user.o 00:01:54.123 CC lib/accel/accel_sw.o 00:01:54.123 CC lib/virtio/virtio_pci.o 00:01:54.123 CC lib/init/json_config.o 00:01:54.123 CC lib/init/subsystem.o 00:01:54.123 CC lib/init/subsystem_rpc.o 00:01:54.123 CC lib/init/rpc.o 00:01:54.123 LIB libspdk_init.a 00:01:54.383 LIB libspdk_virtio.a 00:01:54.383 SO libspdk_init.so.5.0 00:01:54.383 SO libspdk_virtio.so.7.0 00:01:54.383 SYMLINK libspdk_init.so 00:01:54.383 SYMLINK libspdk_virtio.so 00:01:54.643 CC lib/event/app.o 00:01:54.643 CC lib/event/reactor.o 00:01:54.643 CC lib/event/log_rpc.o 00:01:54.643 CC lib/event/app_rpc.o 00:01:54.643 CC lib/event/scheduler_static.o 00:01:54.903 LIB libspdk_accel.a 00:01:54.904 SO libspdk_accel.so.15.0 00:01:54.904 LIB libspdk_nvme.a 00:01:54.904 SYMLINK libspdk_accel.so 
00:01:55.164 SO libspdk_nvme.so.13.0 00:01:55.164 LIB libspdk_event.a 00:01:55.164 SO libspdk_event.so.13.1 00:01:55.164 SYMLINK libspdk_event.so 00:01:55.426 CC lib/bdev/bdev.o 00:01:55.426 CC lib/bdev/bdev_rpc.o 00:01:55.426 SYMLINK libspdk_nvme.so 00:01:55.426 CC lib/bdev/bdev_zone.o 00:01:55.426 CC lib/bdev/part.o 00:01:55.426 CC lib/bdev/scsi_nvme.o 00:01:56.369 LIB libspdk_blob.a 00:01:56.631 SO libspdk_blob.so.11.0 00:01:56.631 SYMLINK libspdk_blob.so 00:01:56.892 CC lib/blobfs/blobfs.o 00:01:56.892 CC lib/blobfs/tree.o 00:01:56.892 CC lib/lvol/lvol.o 00:01:57.465 LIB libspdk_bdev.a 00:01:57.727 SO libspdk_bdev.so.15.0 00:01:57.727 LIB libspdk_blobfs.a 00:01:57.727 SO libspdk_blobfs.so.10.0 00:01:57.727 SYMLINK libspdk_bdev.so 00:01:57.727 LIB libspdk_lvol.a 00:01:57.727 SYMLINK libspdk_blobfs.so 00:01:57.727 SO libspdk_lvol.so.10.0 00:01:57.988 SYMLINK libspdk_lvol.so 00:01:57.988 CC lib/scsi/dev.o 00:01:57.988 CC lib/scsi/lun.o 00:01:57.988 CC lib/scsi/port.o 00:01:57.988 CC lib/scsi/scsi.o 00:01:57.988 CC lib/scsi/scsi_bdev.o 00:01:57.988 CC lib/scsi/scsi_pr.o 00:01:57.988 CC lib/scsi/scsi_rpc.o 00:01:57.988 CC lib/ftl/ftl_core.o 00:01:57.988 CC lib/scsi/task.o 00:01:57.988 CC lib/ftl/ftl_init.o 00:01:57.988 CC lib/ftl/ftl_layout.o 00:01:57.988 CC lib/ftl/ftl_debug.o 00:01:57.988 CC lib/ftl/ftl_io.o 00:01:57.988 CC lib/nbd/nbd.o 00:01:57.988 CC lib/ftl/ftl_sb.o 00:01:57.988 CC lib/ftl/ftl_l2p.o 00:01:57.988 CC lib/ftl/ftl_l2p_flat.o 00:01:57.988 CC lib/ftl/ftl_nv_cache.o 00:01:57.988 CC lib/ftl/ftl_band.o 00:01:57.988 CC lib/nbd/nbd_rpc.o 00:01:57.988 CC lib/ftl/ftl_band_ops.o 00:01:57.988 CC lib/ftl/ftl_writer.o 00:01:57.988 CC lib/ublk/ublk.o 00:01:57.988 CC lib/ftl/ftl_rq.o 00:01:57.988 CC lib/ftl/ftl_reloc.o 00:01:57.988 CC lib/ublk/ublk_rpc.o 00:01:57.988 CC lib/ftl/ftl_l2p_cache.o 00:01:57.988 CC lib/ftl/ftl_p2l.o 00:01:57.988 CC lib/nvmf/ctrlr.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:01:57.988 CC 
lib/nvmf/ctrlr_discovery.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:01:57.988 CC lib/nvmf/ctrlr_bdev.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_startup.o 00:01:57.988 CC lib/nvmf/nvmf.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_md.o 00:01:57.988 CC lib/nvmf/subsystem.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_misc.o 00:01:57.988 CC lib/nvmf/nvmf_rpc.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:01:57.988 CC lib/nvmf/transport.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_band.o 00:01:57.988 CC lib/nvmf/tcp.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:01:57.988 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:01:57.988 CC lib/nvmf/stubs.o 00:01:58.246 CC lib/nvmf/mdns_server.o 00:01:58.246 CC lib/nvmf/rdma.o 00:01:58.246 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:01:58.246 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:01:58.246 CC lib/nvmf/auth.o 00:01:58.246 CC lib/ftl/utils/ftl_conf.o 00:01:58.246 CC lib/ftl/utils/ftl_md.o 00:01:58.246 CC lib/ftl/utils/ftl_mempool.o 00:01:58.246 CC lib/ftl/utils/ftl_bitmap.o 00:01:58.246 CC lib/ftl/utils/ftl_property.o 00:01:58.246 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:01:58.246 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:01:58.246 CC lib/ftl/upgrade/ftl_sb_v3.o 00:01:58.246 CC lib/ftl/upgrade/ftl_sb_v5.o 00:01:58.246 CC lib/ftl/nvc/ftl_nvc_dev.o 00:01:58.246 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:01:58.246 CC lib/ftl/base/ftl_base_dev.o 00:01:58.246 CC lib/ftl/base/ftl_base_bdev.o 00:01:58.246 CC lib/ftl/ftl_trace.o 00:01:58.505 LIB libspdk_nbd.a 00:01:58.505 SO libspdk_nbd.so.7.0 00:01:58.766 LIB libspdk_scsi.a 00:01:58.766 LIB libspdk_ublk.a 00:01:58.766 SYMLINK libspdk_nbd.so 00:01:58.766 SO libspdk_ublk.so.3.0 00:01:58.766 SO libspdk_scsi.so.9.0 
00:01:58.766 SYMLINK libspdk_ublk.so 00:01:58.766 SYMLINK libspdk_scsi.so 00:01:59.027 LIB libspdk_ftl.a 00:01:59.027 CC lib/vhost/vhost.o 00:01:59.027 CC lib/vhost/vhost_scsi.o 00:01:59.027 CC lib/vhost/vhost_rpc.o 00:01:59.027 CC lib/vhost/vhost_blk.o 00:01:59.027 CC lib/vhost/rte_vhost_user.o 00:01:59.027 CC lib/iscsi/conn.o 00:01:59.027 CC lib/iscsi/init_grp.o 00:01:59.027 CC lib/iscsi/iscsi.o 00:01:59.027 CC lib/iscsi/md5.o 00:01:59.027 CC lib/iscsi/param.o 00:01:59.027 CC lib/iscsi/portal_grp.o 00:01:59.027 CC lib/iscsi/iscsi_subsystem.o 00:01:59.027 CC lib/iscsi/tgt_node.o 00:01:59.289 CC lib/iscsi/iscsi_rpc.o 00:01:59.289 CC lib/iscsi/task.o 00:01:59.289 SO libspdk_ftl.so.9.0 00:01:59.549 SYMLINK libspdk_ftl.so 00:01:59.809 LIB libspdk_nvmf.a 00:02:00.070 SO libspdk_nvmf.so.19.0 00:02:00.070 LIB libspdk_vhost.a 00:02:00.070 SO libspdk_vhost.so.8.0 00:02:00.331 SYMLINK libspdk_nvmf.so 00:02:00.331 SYMLINK libspdk_vhost.so 00:02:00.331 LIB libspdk_iscsi.a 00:02:00.331 SO libspdk_iscsi.so.8.0 00:02:00.592 SYMLINK libspdk_iscsi.so 00:02:01.166 CC module/env_dpdk/env_dpdk_rpc.o 00:02:01.166 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:01.166 LIB libspdk_env_dpdk_rpc.a 00:02:01.166 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:01.166 CC module/scheduler/gscheduler/gscheduler.o 00:02:01.166 CC module/sock/posix/posix.o 00:02:01.166 CC module/accel/ioat/accel_ioat.o 00:02:01.166 CC module/accel/ioat/accel_ioat_rpc.o 00:02:01.166 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:01.166 CC module/accel/iaa/accel_iaa_rpc.o 00:02:01.166 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:01.166 CC module/accel/iaa/accel_iaa.o 00:02:01.166 CC module/accel/error/accel_error.o 00:02:01.428 CC module/accel/error/accel_error_rpc.o 00:02:01.428 CC module/accel/dsa/accel_dsa.o 00:02:01.428 CC module/accel/dsa/accel_dsa_rpc.o 00:02:01.428 CC module/blob/bdev/blob_bdev.o 00:02:01.428 CC module/keyring/linux/keyring.o 00:02:01.428 
CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:01.428 CC module/keyring/file/keyring.o 00:02:01.428 CC module/keyring/linux/keyring_rpc.o 00:02:01.428 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:01.428 CC module/keyring/file/keyring_rpc.o 00:02:01.428 SO libspdk_env_dpdk_rpc.so.6.0 00:02:01.428 SYMLINK libspdk_env_dpdk_rpc.so 00:02:01.428 LIB libspdk_scheduler_dpdk_governor.a 00:02:01.428 LIB libspdk_scheduler_dynamic.a 00:02:01.428 LIB libspdk_scheduler_gscheduler.a 00:02:01.428 LIB libspdk_keyring_linux.a 00:02:01.428 SO libspdk_scheduler_dynamic.so.4.0 00:02:01.428 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:01.428 SO libspdk_scheduler_gscheduler.so.4.0 00:02:01.428 LIB libspdk_keyring_file.a 00:02:01.428 LIB libspdk_accel_error.a 00:02:01.428 SO libspdk_keyring_linux.so.1.0 00:02:01.428 LIB libspdk_accel_ioat.a 00:02:01.428 LIB libspdk_accel_iaa.a 00:02:01.428 SO libspdk_keyring_file.so.1.0 00:02:01.428 SO libspdk_accel_error.so.2.0 00:02:01.428 SYMLINK libspdk_scheduler_dynamic.so 00:02:01.428 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:01.428 SYMLINK libspdk_scheduler_gscheduler.so 00:02:01.428 SO libspdk_accel_ioat.so.6.0 00:02:01.428 LIB libspdk_accel_dsa.a 00:02:01.428 SO libspdk_accel_iaa.so.3.0 00:02:01.688 SYMLINK libspdk_keyring_linux.so 00:02:01.688 LIB libspdk_blob_bdev.a 00:02:01.688 SO libspdk_accel_dsa.so.5.0 00:02:01.688 SYMLINK libspdk_keyring_file.so 00:02:01.688 SO libspdk_blob_bdev.so.11.0 00:02:01.688 SYMLINK libspdk_accel_error.so 00:02:01.688 SYMLINK libspdk_accel_ioat.so 00:02:01.688 SYMLINK libspdk_accel_iaa.so 00:02:01.688 SYMLINK libspdk_accel_dsa.so 00:02:01.688 SYMLINK libspdk_blob_bdev.so 00:02:01.949 LIB libspdk_sock_posix.a 00:02:01.949 SO libspdk_sock_posix.so.6.0 00:02:01.949 SYMLINK libspdk_sock_posix.so 00:02:02.208 LIB libspdk_accel_dpdk_cryptodev.a 00:02:02.208 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:02.208 CC module/bdev/gpt/gpt.o 00:02:02.208 CC module/bdev/gpt/vbdev_gpt.o 
00:02:02.208 CC module/blobfs/bdev/blobfs_bdev.o 00:02:02.208 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:02.208 CC module/bdev/lvol/vbdev_lvol.o 00:02:02.208 CC module/bdev/iscsi/bdev_iscsi.o 00:02:02.208 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:02.208 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:02.208 CC module/bdev/passthru/vbdev_passthru.o 00:02:02.208 LIB libspdk_accel_dpdk_compressdev.a 00:02:02.208 CC module/bdev/nvme/bdev_nvme.o 00:02:02.208 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:02.208 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:02.208 CC module/bdev/nvme/nvme_rpc.o 00:02:02.208 CC module/bdev/nvme/bdev_mdns_client.o 00:02:02.208 CC module/bdev/crypto/vbdev_crypto.o 00:02:02.208 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:02.208 CC module/bdev/nvme/vbdev_opal.o 00:02:02.208 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:02.208 CC module/bdev/malloc/bdev_malloc.o 00:02:02.208 CC module/bdev/split/vbdev_split.o 00:02:02.208 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:02.208 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:02.208 CC module/bdev/split/vbdev_split_rpc.o 00:02:02.208 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:02.208 CC module/bdev/compress/vbdev_compress.o 00:02:02.208 CC module/bdev/delay/vbdev_delay.o 00:02:02.208 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:02.208 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:02.208 CC module/bdev/error/vbdev_error_rpc.o 00:02:02.208 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:02.208 CC module/bdev/error/vbdev_error.o 00:02:02.208 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:02.208 CC module/bdev/aio/bdev_aio.o 00:02:02.208 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:02.208 CC module/bdev/aio/bdev_aio_rpc.o 00:02:02.208 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:02.208 CC module/bdev/raid/bdev_raid.o 00:02:02.208 CC module/bdev/raid/bdev_raid_rpc.o 00:02:02.208 CC module/bdev/raid/bdev_raid_sb.o 00:02:02.208 CC module/bdev/null/bdev_null.o 00:02:02.208 CC 
module/bdev/raid/raid0.o 00:02:02.208 CC module/bdev/raid/concat.o 00:02:02.208 CC module/bdev/null/bdev_null_rpc.o 00:02:02.208 CC module/bdev/raid/raid1.o 00:02:02.208 CC module/bdev/ftl/bdev_ftl.o 00:02:02.208 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:02.208 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:02.208 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:02.208 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:02.468 LIB libspdk_blobfs_bdev.a 00:02:02.468 SO libspdk_blobfs_bdev.so.6.0 00:02:02.468 LIB libspdk_bdev_split.a 00:02:02.469 LIB libspdk_bdev_gpt.a 00:02:02.469 SO libspdk_bdev_split.so.6.0 00:02:02.469 SYMLINK libspdk_blobfs_bdev.so 00:02:02.469 SO libspdk_bdev_gpt.so.6.0 00:02:02.469 LIB libspdk_bdev_null.a 00:02:02.469 LIB libspdk_bdev_passthru.a 00:02:02.469 LIB libspdk_bdev_error.a 00:02:02.469 LIB libspdk_bdev_ftl.a 00:02:02.469 SO libspdk_bdev_null.so.6.0 00:02:02.469 SYMLINK libspdk_bdev_split.so 00:02:02.469 SO libspdk_bdev_error.so.6.0 00:02:02.469 SO libspdk_bdev_ftl.so.6.0 00:02:02.469 SO libspdk_bdev_passthru.so.6.0 00:02:02.469 LIB libspdk_bdev_iscsi.a 00:02:02.469 SYMLINK libspdk_bdev_gpt.so 00:02:02.469 LIB libspdk_bdev_zone_block.a 00:02:02.469 LIB libspdk_bdev_malloc.a 00:02:02.469 LIB libspdk_bdev_aio.a 00:02:02.469 LIB libspdk_bdev_crypto.a 00:02:02.469 SO libspdk_bdev_iscsi.so.6.0 00:02:02.469 SYMLINK libspdk_bdev_null.so 00:02:02.469 LIB libspdk_bdev_delay.a 00:02:02.469 SO libspdk_bdev_zone_block.so.6.0 00:02:02.469 LIB libspdk_bdev_compress.a 00:02:02.469 SYMLINK libspdk_bdev_passthru.so 00:02:02.469 SO libspdk_bdev_malloc.so.6.0 00:02:02.469 SO libspdk_bdev_aio.so.6.0 00:02:02.730 SYMLINK libspdk_bdev_error.so 00:02:02.730 SO libspdk_bdev_crypto.so.6.0 00:02:02.730 SYMLINK libspdk_bdev_ftl.so 00:02:02.730 SO libspdk_bdev_compress.so.6.0 00:02:02.730 SO libspdk_bdev_delay.so.6.0 00:02:02.730 SYMLINK libspdk_bdev_zone_block.so 00:02:02.730 SYMLINK libspdk_bdev_iscsi.so 00:02:02.730 LIB libspdk_bdev_lvol.a 00:02:02.730 SYMLINK 
libspdk_bdev_aio.so 00:02:02.730 SYMLINK libspdk_bdev_malloc.so 00:02:02.730 SYMLINK libspdk_bdev_delay.so 00:02:02.730 SYMLINK libspdk_bdev_compress.so 00:02:02.730 SYMLINK libspdk_bdev_crypto.so 00:02:02.730 SO libspdk_bdev_lvol.so.6.0 00:02:02.730 LIB libspdk_bdev_virtio.a 00:02:02.730 SO libspdk_bdev_virtio.so.6.0 00:02:02.730 SYMLINK libspdk_bdev_lvol.so 00:02:02.730 SYMLINK libspdk_bdev_virtio.so 00:02:02.991 LIB libspdk_bdev_raid.a 00:02:02.991 SO libspdk_bdev_raid.so.6.0 00:02:03.252 SYMLINK libspdk_bdev_raid.so 00:02:04.196 LIB libspdk_bdev_nvme.a 00:02:04.196 SO libspdk_bdev_nvme.so.7.0 00:02:04.196 SYMLINK libspdk_bdev_nvme.so 00:02:04.768 CC module/event/subsystems/scheduler/scheduler.o 00:02:05.030 CC module/event/subsystems/sock/sock.o 00:02:05.030 CC module/event/subsystems/vmd/vmd.o 00:02:05.030 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:05.030 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:05.030 CC module/event/subsystems/iobuf/iobuf.o 00:02:05.030 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:05.030 CC module/event/subsystems/keyring/keyring.o 00:02:05.030 LIB libspdk_event_scheduler.a 00:02:05.030 LIB libspdk_event_vhost_blk.a 00:02:05.030 LIB libspdk_event_sock.a 00:02:05.030 LIB libspdk_event_keyring.a 00:02:05.030 LIB libspdk_event_vmd.a 00:02:05.030 LIB libspdk_event_iobuf.a 00:02:05.030 SO libspdk_event_scheduler.so.4.0 00:02:05.030 SO libspdk_event_vhost_blk.so.3.0 00:02:05.030 SO libspdk_event_sock.so.5.0 00:02:05.030 SO libspdk_event_keyring.so.1.0 00:02:05.030 SO libspdk_event_vmd.so.6.0 00:02:05.308 SO libspdk_event_iobuf.so.3.0 00:02:05.308 SYMLINK libspdk_event_scheduler.so 00:02:05.308 SYMLINK libspdk_event_vhost_blk.so 00:02:05.308 SYMLINK libspdk_event_sock.so 00:02:05.308 SYMLINK libspdk_event_keyring.so 00:02:05.308 SYMLINK libspdk_event_vmd.so 00:02:05.308 SYMLINK libspdk_event_iobuf.so 00:02:05.610 CC module/event/subsystems/accel/accel.o 00:02:05.610 LIB libspdk_event_accel.a 00:02:05.906 SO 
libspdk_event_accel.so.6.0 00:02:05.906 SYMLINK libspdk_event_accel.so 00:02:06.166 CC module/event/subsystems/bdev/bdev.o 00:02:06.426 LIB libspdk_event_bdev.a 00:02:06.426 SO libspdk_event_bdev.so.6.0 00:02:06.426 SYMLINK libspdk_event_bdev.so 00:02:06.686 CC module/event/subsystems/ublk/ublk.o 00:02:06.947 CC module/event/subsystems/scsi/scsi.o 00:02:06.947 CC module/event/subsystems/nbd/nbd.o 00:02:06.947 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:06.947 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:06.947 LIB libspdk_event_ublk.a 00:02:06.947 LIB libspdk_event_nbd.a 00:02:06.947 LIB libspdk_event_scsi.a 00:02:06.947 SO libspdk_event_ublk.so.3.0 00:02:06.947 SO libspdk_event_nbd.so.6.0 00:02:06.947 SO libspdk_event_scsi.so.6.0 00:02:06.947 LIB libspdk_event_nvmf.a 00:02:07.208 SYMLINK libspdk_event_ublk.so 00:02:07.208 SYMLINK libspdk_event_nbd.so 00:02:07.208 SYMLINK libspdk_event_scsi.so 00:02:07.208 SO libspdk_event_nvmf.so.6.0 00:02:07.208 SYMLINK libspdk_event_nvmf.so 00:02:07.469 CC module/event/subsystems/iscsi/iscsi.o 00:02:07.469 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:07.729 LIB libspdk_event_vhost_scsi.a 00:02:07.729 LIB libspdk_event_iscsi.a 00:02:07.729 SO libspdk_event_vhost_scsi.so.3.0 00:02:07.729 SO libspdk_event_iscsi.so.6.0 00:02:07.729 SYMLINK libspdk_event_vhost_scsi.so 00:02:07.729 SYMLINK libspdk_event_iscsi.so 00:02:07.990 SO libspdk.so.6.0 00:02:07.990 SYMLINK libspdk.so 00:02:08.252 CC app/spdk_top/spdk_top.o 00:02:08.252 CC app/trace_record/trace_record.o 00:02:08.252 CC app/spdk_lspci/spdk_lspci.o 00:02:08.252 CC app/spdk_nvme_perf/perf.o 00:02:08.252 CXX app/trace/trace.o 00:02:08.252 CC app/spdk_nvme_identify/identify.o 00:02:08.252 CC app/spdk_nvme_discover/discovery_aer.o 00:02:08.252 CC test/rpc_client/rpc_client_test.o 00:02:08.528 TEST_HEADER include/spdk/accel.h 00:02:08.528 TEST_HEADER include/spdk/assert.h 00:02:08.528 TEST_HEADER include/spdk/accel_module.h 00:02:08.528 CC app/spdk_dd/spdk_dd.o 
00:02:08.528 TEST_HEADER include/spdk/base64.h 00:02:08.528 TEST_HEADER include/spdk/bdev.h 00:02:08.528 TEST_HEADER include/spdk/bdev_module.h 00:02:08.528 TEST_HEADER include/spdk/barrier.h 00:02:08.528 TEST_HEADER include/spdk/bdev_zone.h 00:02:08.528 TEST_HEADER include/spdk/bit_array.h 00:02:08.528 TEST_HEADER include/spdk/blob_bdev.h 00:02:08.528 CC app/iscsi_tgt/iscsi_tgt.o 00:02:08.528 CC app/nvmf_tgt/nvmf_main.o 00:02:08.528 TEST_HEADER include/spdk/blobfs.h 00:02:08.528 TEST_HEADER include/spdk/bit_pool.h 00:02:08.528 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:08.528 TEST_HEADER include/spdk/blob.h 00:02:08.528 TEST_HEADER include/spdk/conf.h 00:02:08.528 TEST_HEADER include/spdk/config.h 00:02:08.528 TEST_HEADER include/spdk/cpuset.h 00:02:08.528 TEST_HEADER include/spdk/crc32.h 00:02:08.528 TEST_HEADER include/spdk/crc64.h 00:02:08.528 TEST_HEADER include/spdk/crc16.h 00:02:08.528 TEST_HEADER include/spdk/dif.h 00:02:08.528 TEST_HEADER include/spdk/dma.h 00:02:08.528 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:08.528 CC app/vhost/vhost.o 00:02:08.528 TEST_HEADER include/spdk/endian.h 00:02:08.528 TEST_HEADER include/spdk/env.h 00:02:08.528 TEST_HEADER include/spdk/env_dpdk.h 00:02:08.528 TEST_HEADER include/spdk/fd_group.h 00:02:08.528 TEST_HEADER include/spdk/fd.h 00:02:08.528 TEST_HEADER include/spdk/event.h 00:02:08.528 TEST_HEADER include/spdk/file.h 00:02:08.528 CC app/spdk_tgt/spdk_tgt.o 00:02:08.528 TEST_HEADER include/spdk/ftl.h 00:02:08.528 TEST_HEADER include/spdk/hexlify.h 00:02:08.528 TEST_HEADER include/spdk/gpt_spec.h 00:02:08.528 TEST_HEADER include/spdk/idxd.h 00:02:08.528 TEST_HEADER include/spdk/histogram_data.h 00:02:08.528 TEST_HEADER include/spdk/idxd_spec.h 00:02:08.528 TEST_HEADER include/spdk/init.h 00:02:08.528 TEST_HEADER include/spdk/ioat.h 00:02:08.528 TEST_HEADER include/spdk/iscsi_spec.h 00:02:08.528 TEST_HEADER include/spdk/ioat_spec.h 00:02:08.528 TEST_HEADER include/spdk/jsonrpc.h 00:02:08.528 TEST_HEADER 
include/spdk/keyring.h 00:02:08.528 TEST_HEADER include/spdk/json.h 00:02:08.528 TEST_HEADER include/spdk/keyring_module.h 00:02:08.528 TEST_HEADER include/spdk/likely.h 00:02:08.528 TEST_HEADER include/spdk/log.h 00:02:08.528 TEST_HEADER include/spdk/lvol.h 00:02:08.528 TEST_HEADER include/spdk/mmio.h 00:02:08.528 TEST_HEADER include/spdk/memory.h 00:02:08.528 TEST_HEADER include/spdk/nbd.h 00:02:08.528 TEST_HEADER include/spdk/notify.h 00:02:08.528 TEST_HEADER include/spdk/nvme.h 00:02:08.528 TEST_HEADER include/spdk/nvme_intel.h 00:02:08.528 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:08.528 TEST_HEADER include/spdk/nvme_spec.h 00:02:08.528 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:08.528 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:08.528 TEST_HEADER include/spdk/nvme_zns.h 00:02:08.528 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:08.528 TEST_HEADER include/spdk/nvmf_spec.h 00:02:08.528 TEST_HEADER include/spdk/nvmf.h 00:02:08.528 TEST_HEADER include/spdk/opal.h 00:02:08.528 TEST_HEADER include/spdk/nvmf_transport.h 00:02:08.528 TEST_HEADER include/spdk/opal_spec.h 00:02:08.528 TEST_HEADER include/spdk/pci_ids.h 00:02:08.528 TEST_HEADER include/spdk/pipe.h 00:02:08.528 TEST_HEADER include/spdk/reduce.h 00:02:08.528 TEST_HEADER include/spdk/queue.h 00:02:08.528 TEST_HEADER include/spdk/rpc.h 00:02:08.528 TEST_HEADER include/spdk/scsi.h 00:02:08.528 TEST_HEADER include/spdk/scheduler.h 00:02:08.528 TEST_HEADER include/spdk/scsi_spec.h 00:02:08.528 TEST_HEADER include/spdk/stdinc.h 00:02:08.528 TEST_HEADER include/spdk/sock.h 00:02:08.528 TEST_HEADER include/spdk/string.h 00:02:08.528 TEST_HEADER include/spdk/thread.h 00:02:08.528 TEST_HEADER include/spdk/trace.h 00:02:08.528 TEST_HEADER include/spdk/trace_parser.h 00:02:08.528 TEST_HEADER include/spdk/tree.h 00:02:08.528 TEST_HEADER include/spdk/ublk.h 00:02:08.528 TEST_HEADER include/spdk/util.h 00:02:08.528 TEST_HEADER include/spdk/uuid.h 00:02:08.528 TEST_HEADER include/spdk/version.h 00:02:08.528 
TEST_HEADER include/spdk/vfio_user_pci.h 00:02:08.528 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:08.528 TEST_HEADER include/spdk/vmd.h 00:02:08.528 TEST_HEADER include/spdk/vhost.h 00:02:08.528 TEST_HEADER include/spdk/zipf.h 00:02:08.528 TEST_HEADER include/spdk/xor.h 00:02:08.528 CXX test/cpp_headers/accel.o 00:02:08.528 CXX test/cpp_headers/accel_module.o 00:02:08.528 CXX test/cpp_headers/assert.o 00:02:08.528 CXX test/cpp_headers/base64.o 00:02:08.528 CXX test/cpp_headers/barrier.o 00:02:08.528 CXX test/cpp_headers/bdev.o 00:02:08.528 CXX test/cpp_headers/bdev_zone.o 00:02:08.528 CXX test/cpp_headers/bit_array.o 00:02:08.528 CXX test/cpp_headers/bdev_module.o 00:02:08.528 CXX test/cpp_headers/blob_bdev.o 00:02:08.528 CXX test/cpp_headers/bit_pool.o 00:02:08.528 CXX test/cpp_headers/blobfs.o 00:02:08.528 CXX test/cpp_headers/blobfs_bdev.o 00:02:08.528 CXX test/cpp_headers/blob.o 00:02:08.528 CXX test/cpp_headers/conf.o 00:02:08.528 CXX test/cpp_headers/config.o 00:02:08.528 CXX test/cpp_headers/cpuset.o 00:02:08.528 CXX test/cpp_headers/crc16.o 00:02:08.528 CXX test/cpp_headers/crc32.o 00:02:08.528 CXX test/cpp_headers/dma.o 00:02:08.528 CXX test/cpp_headers/dif.o 00:02:08.528 CXX test/cpp_headers/crc64.o 00:02:08.528 CXX test/cpp_headers/endian.o 00:02:08.528 CXX test/cpp_headers/env.o 00:02:08.528 CXX test/cpp_headers/event.o 00:02:08.528 CXX test/cpp_headers/fd_group.o 00:02:08.528 CXX test/cpp_headers/env_dpdk.o 00:02:08.528 CXX test/cpp_headers/fd.o 00:02:08.528 CXX test/cpp_headers/ftl.o 00:02:08.528 CXX test/cpp_headers/file.o 00:02:08.528 CXX test/cpp_headers/hexlify.o 00:02:08.528 CXX test/cpp_headers/gpt_spec.o 00:02:08.528 CXX test/cpp_headers/histogram_data.o 00:02:08.528 CXX test/cpp_headers/idxd.o 00:02:08.528 CXX test/cpp_headers/idxd_spec.o 00:02:08.528 CXX test/cpp_headers/init.o 00:02:08.528 CXX test/cpp_headers/iscsi_spec.o 00:02:08.528 CXX test/cpp_headers/ioat.o 00:02:08.528 CXX test/cpp_headers/ioat_spec.o 00:02:08.528 CXX 
test/cpp_headers/keyring.o 00:02:08.529 CXX test/cpp_headers/json.o 00:02:08.529 CXX test/cpp_headers/keyring_module.o 00:02:08.529 CXX test/cpp_headers/jsonrpc.o 00:02:08.529 CXX test/cpp_headers/likely.o 00:02:08.529 CXX test/cpp_headers/memory.o 00:02:08.529 CXX test/cpp_headers/lvol.o 00:02:08.529 CXX test/cpp_headers/log.o 00:02:08.529 CXX test/cpp_headers/mmio.o 00:02:08.529 CXX test/cpp_headers/nbd.o 00:02:08.529 CXX test/cpp_headers/notify.o 00:02:08.529 CXX test/cpp_headers/nvme.o 00:02:08.529 CXX test/cpp_headers/nvme_intel.o 00:02:08.529 CXX test/cpp_headers/nvme_ocssd.o 00:02:08.529 CC test/event/reactor/reactor.o 00:02:08.529 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:08.529 CXX test/cpp_headers/nvme_zns.o 00:02:08.529 CXX test/cpp_headers/nvme_spec.o 00:02:08.529 CXX test/cpp_headers/nvmf_cmd.o 00:02:08.529 CXX test/cpp_headers/nvmf.o 00:02:08.529 CXX test/cpp_headers/nvmf_spec.o 00:02:08.529 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:08.529 CC examples/nvme/reconnect/reconnect.o 00:02:08.529 CC examples/ioat/perf/perf.o 00:02:08.529 CXX test/cpp_headers/nvmf_transport.o 00:02:08.529 CXX test/cpp_headers/opal.o 00:02:08.529 CC examples/ioat/verify/verify.o 00:02:08.529 CC examples/sock/hello_world/hello_sock.o 00:02:08.529 CC examples/accel/perf/accel_perf.o 00:02:08.529 CXX test/cpp_headers/opal_spec.o 00:02:08.529 CXX test/cpp_headers/reduce.o 00:02:08.529 CXX test/cpp_headers/pci_ids.o 00:02:08.529 CXX test/cpp_headers/pipe.o 00:02:08.529 CXX test/cpp_headers/queue.o 00:02:08.529 LINK spdk_lspci 00:02:08.529 CXX test/cpp_headers/scheduler.o 00:02:08.529 CC test/event/event_perf/event_perf.o 00:02:08.529 CXX test/cpp_headers/rpc.o 00:02:08.529 CC examples/nvme/hello_world/hello_world.o 00:02:08.529 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:08.529 CC test/nvme/aer/aer.o 00:02:08.529 CC examples/vmd/lsvmd/lsvmd.o 00:02:08.529 CC test/env/vtophys/vtophys.o 00:02:08.529 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:08.529 CC 
test/nvme/err_injection/err_injection.o 00:02:08.529 CC test/nvme/e2edp/nvme_dp.o 00:02:08.529 CC test/nvme/connect_stress/connect_stress.o 00:02:08.529 CC test/nvme/startup/startup.o 00:02:08.529 CC test/app/jsoncat/jsoncat.o 00:02:08.529 CC test/nvme/boot_partition/boot_partition.o 00:02:08.529 CC examples/nvme/hotplug/hotplug.o 00:02:08.529 CC test/thread/poller_perf/poller_perf.o 00:02:08.529 CC test/nvme/overhead/overhead.o 00:02:08.529 CC test/nvme/simple_copy/simple_copy.o 00:02:08.529 CC test/nvme/reset/reset.o 00:02:08.529 CC app/fio/nvme/fio_plugin.o 00:02:08.529 CC test/nvme/sgl/sgl.o 00:02:08.529 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:08.529 CC test/nvme/compliance/nvme_compliance.o 00:02:08.529 CC test/nvme/cuse/cuse.o 00:02:08.529 CC test/app/histogram_perf/histogram_perf.o 00:02:08.529 CC examples/nvme/abort/abort.o 00:02:08.529 CC examples/nvme/arbitration/arbitration.o 00:02:08.529 CC test/event/reactor_perf/reactor_perf.o 00:02:08.796 CC test/env/memory/memory_ut.o 00:02:08.796 CC test/bdev/bdevio/bdevio.o 00:02:08.796 CC examples/idxd/perf/perf.o 00:02:08.796 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:08.796 CC examples/vmd/led/led.o 00:02:08.796 CXX test/cpp_headers/scsi.o 00:02:08.796 CC test/nvme/reserve/reserve.o 00:02:08.796 CC examples/util/zipf/zipf.o 00:02:08.796 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:08.796 CC examples/bdev/hello_world/hello_bdev.o 00:02:08.796 CC test/accel/dif/dif.o 00:02:08.796 CC test/env/pci/pci_ut.o 00:02:08.796 CC test/nvme/fdp/fdp.o 00:02:08.796 CC test/event/app_repeat/app_repeat.o 00:02:08.796 CC test/nvme/fused_ordering/fused_ordering.o 00:02:08.796 CC examples/util/tls_psk/tls_psk_print.o 00:02:08.796 CC examples/blob/hello_world/hello_blob.o 00:02:08.796 CC test/dma/test_dma/test_dma.o 00:02:08.796 CC test/blobfs/mkfs/mkfs.o 00:02:08.796 CC test/app/stub/stub.o 00:02:08.796 CC app/fio/bdev/fio_plugin.o 00:02:08.796 CC examples/blob/cli/blobcli.o 00:02:08.796 CC 
examples/nvmf/nvmf/nvmf.o 00:02:08.796 CC test/app/bdev_svc/bdev_svc.o 00:02:08.796 CC examples/bdev/bdevperf/bdevperf.o 00:02:08.796 CXX test/cpp_headers/scsi_spec.o 00:02:08.796 CC test/event/scheduler/scheduler.o 00:02:08.796 CC examples/thread/thread/thread_ex.o 00:02:09.060 LINK rpc_client_test 00:02:09.060 LINK nvmf_tgt 00:02:09.060 LINK iscsi_tgt 00:02:09.060 LINK spdk_nvme_discover 00:02:09.060 LINK interrupt_tgt 00:02:09.060 LINK vhost 00:02:09.060 CC test/lvol/esnap/esnap.o 00:02:09.060 CC test/env/mem_callbacks/mem_callbacks.o 00:02:09.060 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:09.060 LINK spdk_trace_record 00:02:09.060 LINK spdk_tgt 00:02:09.318 CXX test/cpp_headers/sock.o 00:02:09.318 LINK lsvmd 00:02:09.318 LINK boot_partition 00:02:09.318 LINK jsoncat 00:02:09.318 LINK event_perf 00:02:09.318 LINK reactor 00:02:09.318 LINK cmb_copy 00:02:09.318 LINK err_injection 00:02:09.318 LINK startup 00:02:09.318 LINK vtophys 00:02:09.318 LINK zipf 00:02:09.318 LINK env_dpdk_post_init 00:02:09.318 LINK spdk_dd 00:02:09.318 LINK app_repeat 00:02:09.318 LINK doorbell_aers 00:02:09.318 LINK histogram_perf 00:02:09.318 LINK poller_perf 00:02:09.318 LINK reactor_perf 00:02:09.318 LINK connect_stress 00:02:09.318 LINK pmr_persistence 00:02:09.318 LINK led 00:02:09.318 LINK hotplug 00:02:09.318 LINK ioat_perf 00:02:09.318 CXX test/cpp_headers/stdinc.o 00:02:09.318 CXX test/cpp_headers/string.o 00:02:09.318 LINK bdev_svc 00:02:09.318 CXX test/cpp_headers/thread.o 00:02:09.318 CXX test/cpp_headers/trace.o 00:02:09.318 LINK stub 00:02:09.318 LINK mkfs 00:02:09.318 CXX test/cpp_headers/trace_parser.o 00:02:09.318 CXX test/cpp_headers/tree.o 00:02:09.318 CXX test/cpp_headers/ublk.o 00:02:09.318 CXX test/cpp_headers/util.o 00:02:09.318 CXX test/cpp_headers/uuid.o 00:02:09.318 LINK verify 00:02:09.318 LINK reserve 00:02:09.318 CXX test/cpp_headers/version.o 00:02:09.318 CXX test/cpp_headers/vfio_user_pci.o 00:02:09.318 CXX test/cpp_headers/vfio_user_spec.o 00:02:09.318 
CXX test/cpp_headers/vhost.o 00:02:09.318 CXX test/cpp_headers/vmd.o 00:02:09.318 CXX test/cpp_headers/xor.o 00:02:09.318 CXX test/cpp_headers/zipf.o 00:02:09.318 LINK hello_world 00:02:09.318 LINK nvme_dp 00:02:09.318 LINK simple_copy 00:02:09.318 LINK fused_ordering 00:02:09.318 LINK hello_sock 00:02:09.318 LINK hello_bdev 00:02:09.577 LINK hello_blob 00:02:09.577 LINK sgl 00:02:09.577 LINK aer 00:02:09.577 LINK reset 00:02:09.577 LINK scheduler 00:02:09.577 LINK reconnect 00:02:09.577 LINK overhead 00:02:09.577 LINK nvme_compliance 00:02:09.577 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:09.577 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:09.577 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:09.577 LINK thread 00:02:09.577 LINK spdk_trace 00:02:09.577 LINK abort 00:02:09.577 LINK idxd_perf 00:02:09.577 LINK nvmf 00:02:09.577 LINK test_dma 00:02:09.578 LINK arbitration 00:02:09.578 LINK fdp 00:02:09.578 LINK bdevio 00:02:09.578 LINK tls_psk_print 00:02:09.578 LINK accel_perf 00:02:09.578 LINK pci_ut 00:02:09.838 LINK blobcli 00:02:09.838 LINK nvme_manage 00:02:09.838 LINK dif 00:02:09.838 LINK spdk_bdev 00:02:09.838 LINK nvme_fuzz 00:02:09.838 LINK spdk_nvme 00:02:09.838 LINK spdk_nvme_perf 00:02:09.838 LINK spdk_nvme_identify 00:02:09.838 LINK spdk_top 00:02:09.838 LINK mem_callbacks 00:02:10.099 LINK vhost_fuzz 00:02:10.099 LINK bdevperf 00:02:10.099 LINK memory_ut 00:02:10.360 LINK cuse 00:02:10.930 LINK iscsi_fuzz 00:02:13.473 LINK esnap 00:02:13.735 00:02:13.735 real 1m16.293s 00:02:13.735 user 13m24.812s 00:02:13.735 sys 5m34.777s 00:02:13.735 13:30:28 make -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:02:13.735 13:30:28 make -- common/autotest_common.sh@10 -- $ set +x 00:02:13.735 ************************************ 00:02:13.735 END TEST make 00:02:13.735 ************************************ 00:02:13.735 13:30:28 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:13.735 13:30:28 -- pm/common@29 -- $ signal_monitor_resources TERM 
00:02:13.735 13:30:28 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:13.735 13:30:28 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.735 13:30:28 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:13.735 13:30:28 -- pm/common@44 -- $ pid=1299040 00:02:13.735 13:30:28 -- pm/common@50 -- $ kill -TERM 1299040 00:02:13.735 13:30:28 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.735 13:30:28 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:13.735 13:30:28 -- pm/common@44 -- $ pid=1299041 00:02:13.735 13:30:28 -- pm/common@50 -- $ kill -TERM 1299041 00:02:13.735 13:30:28 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.735 13:30:28 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:13.735 13:30:28 -- pm/common@44 -- $ pid=1299043 00:02:13.735 13:30:28 -- pm/common@50 -- $ kill -TERM 1299043 00:02:13.735 13:30:28 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.735 13:30:28 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:13.735 13:30:28 -- pm/common@44 -- $ pid=1299067 00:02:13.735 13:30:28 -- pm/common@50 -- $ sudo -E kill -TERM 1299067 00:02:13.997 13:30:28 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:13.997 13:30:28 -- nvmf/common.sh@7 -- # uname -s 00:02:13.997 13:30:28 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:13.997 13:30:28 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:13.997 13:30:28 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:13.997 13:30:28 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:13.997 13:30:28 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:13.997 13:30:28 -- 
nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:13.997 13:30:28 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:13.997 13:30:28 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:13.997 13:30:28 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:13.997 13:30:28 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:13.997 13:30:28 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306 00:02:13.997 13:30:28 -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306 00:02:13.997 13:30:28 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:13.997 13:30:28 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:13.997 13:30:28 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:13.997 13:30:28 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:13.997 13:30:28 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:13.997 13:30:28 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:13.997 13:30:28 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:13.997 13:30:28 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:13.997 13:30:28 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.997 13:30:28 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.997 13:30:28 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.997 13:30:28 -- paths/export.sh@5 -- # export PATH 00:02:13.997 13:30:28 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:13.997 13:30:28 -- nvmf/common.sh@47 -- # : 0 00:02:13.997 13:30:28 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:13.997 13:30:28 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:13.997 13:30:28 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:13.997 13:30:28 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:13.997 13:30:28 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:13.997 13:30:28 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:13.997 13:30:28 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:13.997 13:30:28 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:13.997 13:30:28 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:13.997 13:30:28 -- spdk/autotest.sh@32 -- # uname -s 00:02:13.997 13:30:28 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:13.997 13:30:28 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:13.997 13:30:28 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:13.997 13:30:28 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:13.997 13:30:28 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 
00:02:13.997 13:30:28 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:13.997 13:30:28 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:13.997 13:30:28 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:13.997 13:30:28 -- spdk/autotest.sh@48 -- # udevadm_pid=1370589 00:02:13.997 13:30:28 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:13.997 13:30:28 -- pm/common@17 -- # local monitor 00:02:13.997 13:30:28 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:13.997 13:30:28 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.997 13:30:28 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.997 13:30:28 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.997 13:30:28 -- pm/common@21 -- # date +%s 00:02:13.997 13:30:28 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.997 13:30:28 -- pm/common@21 -- # date +%s 00:02:13.997 13:30:28 -- pm/common@25 -- # sleep 1 00:02:13.997 13:30:28 -- pm/common@21 -- # date +%s 00:02:13.997 13:30:28 -- pm/common@21 -- # date +%s 00:02:13.997 13:30:28 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718019028 00:02:13.997 13:30:28 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718019028 00:02:13.998 13:30:28 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718019028 00:02:13.998 13:30:28 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p 
monitor.autotest.sh.1718019028 00:02:13.998 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718019028_collect-vmstat.pm.log 00:02:13.998 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718019028_collect-cpu-load.pm.log 00:02:13.998 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718019028_collect-cpu-temp.pm.log 00:02:13.998 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718019028_collect-bmc-pm.bmc.pm.log 00:02:14.941 13:30:29 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:14.941 13:30:29 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:14.941 13:30:29 -- common/autotest_common.sh@723 -- # xtrace_disable 00:02:14.941 13:30:29 -- common/autotest_common.sh@10 -- # set +x 00:02:14.941 13:30:29 -- spdk/autotest.sh@59 -- # create_test_list 00:02:14.941 13:30:29 -- common/autotest_common.sh@747 -- # xtrace_disable 00:02:14.941 13:30:29 -- common/autotest_common.sh@10 -- # set +x 00:02:14.941 13:30:29 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:14.941 13:30:29 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:14.941 13:30:29 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:14.941 13:30:29 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:14.941 13:30:29 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:14.941 13:30:29 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:14.941 13:30:29 -- common/autotest_common.sh@1454 -- # uname 00:02:14.941 13:30:29 -- common/autotest_common.sh@1454 -- # '[' Linux = FreeBSD ']' 00:02:14.941 13:30:29 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 
00:02:14.941 13:30:29 -- common/autotest_common.sh@1474 -- # uname 00:02:15.203 13:30:29 -- common/autotest_common.sh@1474 -- # [[ Linux = FreeBSD ]] 00:02:15.203 13:30:29 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:15.203 13:30:29 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:15.203 13:30:29 -- spdk/autotest.sh@72 -- # hash lcov 00:02:15.203 13:30:29 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:15.203 13:30:29 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:15.203 --rc lcov_branch_coverage=1 00:02:15.203 --rc lcov_function_coverage=1 00:02:15.203 --rc genhtml_branch_coverage=1 00:02:15.203 --rc genhtml_function_coverage=1 00:02:15.203 --rc genhtml_legend=1 00:02:15.203 --rc geninfo_all_blocks=1 00:02:15.203 ' 00:02:15.203 13:30:29 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:15.203 --rc lcov_branch_coverage=1 00:02:15.203 --rc lcov_function_coverage=1 00:02:15.203 --rc genhtml_branch_coverage=1 00:02:15.203 --rc genhtml_function_coverage=1 00:02:15.203 --rc genhtml_legend=1 00:02:15.203 --rc geninfo_all_blocks=1 00:02:15.203 ' 00:02:15.203 13:30:29 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:15.203 --rc lcov_branch_coverage=1 00:02:15.203 --rc lcov_function_coverage=1 00:02:15.203 --rc genhtml_branch_coverage=1 00:02:15.203 --rc genhtml_function_coverage=1 00:02:15.203 --rc genhtml_legend=1 00:02:15.203 --rc geninfo_all_blocks=1 00:02:15.203 --no-external' 00:02:15.203 13:30:29 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:15.203 --rc lcov_branch_coverage=1 00:02:15.203 --rc lcov_function_coverage=1 00:02:15.203 --rc genhtml_branch_coverage=1 00:02:15.203 --rc genhtml_function_coverage=1 00:02:15.203 --rc genhtml_legend=1 00:02:15.203 --rc geninfo_all_blocks=1 00:02:15.203 --no-external' 00:02:15.203 13:30:29 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 
--no-external -v 00:02:15.203 lcov: LCOV version 1.14 00:02:15.203 13:30:29 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:27.441 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:27.441 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:42.359 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:42.360 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data 
for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:42.360 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 
00:02:42.360 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:42.360 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 
00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:42.361 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 
00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 
00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:02:42.361 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:42.361 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:44.280 13:30:58 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:44.280 13:30:58 -- common/autotest_common.sh@723 -- # xtrace_disable 00:02:44.280 13:30:58 -- common/autotest_common.sh@10 -- # set +x 00:02:44.281 13:30:58 -- spdk/autotest.sh@91 -- # rm -f 00:02:44.281 13:30:58 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:02:48.488 0000:80:01.6 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.7 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.4 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.5 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.2 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.3 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.0 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:80:01.1 (8086 
0b00): Already using the ioatdma driver 00:02:48.488 0000:65:00.0 (144d a80a): Already using the nvme driver 00:02:48.488 0000:00:01.6 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.7 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.4 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.5 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.2 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.3 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.0 (8086 0b00): Already using the ioatdma driver 00:02:48.488 0000:00:01.1 (8086 0b00): Already using the ioatdma driver 00:02:48.488 13:31:02 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:48.488 13:31:02 -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:02:48.488 13:31:02 -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:02:48.488 13:31:02 -- common/autotest_common.sh@1669 -- # local nvme bdf 00:02:48.488 13:31:02 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:02:48.488 13:31:02 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:02:48.488 13:31:02 -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:02:48.488 13:31:02 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:48.488 13:31:02 -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:02:48.488 13:31:02 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:48.488 13:31:02 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:48.488 13:31:02 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:48.488 13:31:02 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:02:48.488 13:31:02 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:48.488 13:31:02 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:48.488 No valid GPT data, bailing 00:02:48.488 13:31:02 -- 
scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:48.488 13:31:02 -- scripts/common.sh@391 -- # pt= 00:02:48.488 13:31:02 -- scripts/common.sh@392 -- # return 1 00:02:48.488 13:31:02 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:48.749 1+0 records in 00:02:48.749 1+0 records out 00:02:48.749 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00446597 s, 235 MB/s 00:02:48.749 13:31:02 -- spdk/autotest.sh@118 -- # sync 00:02:48.749 13:31:02 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:48.749 13:31:02 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:48.750 13:31:02 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:56.893 13:31:10 -- spdk/autotest.sh@124 -- # uname -s 00:02:56.893 13:31:10 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:02:56.893 13:31:10 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:56.893 13:31:10 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:02:56.893 13:31:10 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:02:56.893 13:31:10 -- common/autotest_common.sh@10 -- # set +x 00:02:56.893 ************************************ 00:02:56.893 START TEST setup.sh 00:02:56.893 ************************************ 00:02:56.893 13:31:10 setup.sh -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:56.893 * Looking for test storage... 
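The `get_zoned_devs` helper traced above classifies an NVMe namespace as zoned when `/sys/block/<dev>/queue/zoned` reads anything other than `none` (the `[[ none != none ]]` check in the trace is that comparison with the value substituted in). A sketch of that filter, which simply prints what it finds on the current host:

```shell
#!/usr/bin/env bash
# Sketch of the zoned-device filter used by get_zoned_devs above: a device
# is zoned when its queue/zoned sysfs attribute is not "none".
set -eo pipefail

declare -A zoned_devs=()
for nvme in /sys/block/nvme*; do
    # Skip the literal glob when no NVMe devices exist, and devices
    # without the attribute (very old kernels).
    [ -e "$nvme/queue/zoned" ] || continue
    if [ "$(cat "$nvme/queue/zoned")" != "none" ]; then
        zoned_devs[$(basename "$nvme")]=1
    fi
done

keys="${!zoned_devs[*]}"
echo "zoned devices: ${keys:-none}"
```

In this log the check returns no zoned devices (`(( 0 > 0 ))` is false), so the subsequent `dd if=/dev/zero` wipe and GPT probe run against the plain namespace `/dev/nvme0n1`.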
00:02:56.893 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:56.893 13:31:10 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:02:56.893 13:31:10 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:56.893 13:31:10 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:56.893 13:31:10 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:02:56.893 13:31:10 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:02:56.893 13:31:10 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:02:56.893 ************************************ 00:02:56.893 START TEST acl 00:02:56.893 ************************************ 00:02:56.893 13:31:10 setup.sh.acl -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:56.893 * Looking for test storage... 00:02:56.893 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:56.893 13:31:11 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1669 -- # local nvme bdf 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:56.893 13:31:11 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:02:56.893 13:31:11 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:02:56.893 13:31:11 setup.sh.acl -- 
setup/acl.sh@12 -- # declare -a devs 00:02:56.893 13:31:11 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:02:56.893 13:31:11 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:02:56.893 13:31:11 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:02:56.893 13:31:11 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:56.893 13:31:11 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:01.105 13:31:15 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:01.105 13:31:15 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:01.105 13:31:15 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:01.105 13:31:15 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:01.105 13:31:15 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:01.105 13:31:15 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:05.311 Hugepages 00:03:05.311 node hugesize free / total 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.311 00:03:05.311 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:05.311 13:31:19 setup.sh.acl -- 
setup/acl.sh@19 -- # continue 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]] 00:03:05.311 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.1 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 
13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]] 00:03:05.312 13:31:19 
setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:05.312 13:31:19 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 
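The `denied`/`allowed` subtests that follow drive `scripts/setup.sh` through the `PCI_BLOCKED` and `PCI_ALLOWED` environment variables (visible as `PCI_BLOCKED=' 0000:65:00.0'` and `PCI_ALLOWED=0000:65:00.0` in the trace). A hedged sketch of the filtering idea, loosely modeled on SPDK's `pci_can_use` helper but simplified here to plain string matching; treat the exact semantics as an assumption:

```shell
# Simplified stand-in for the allow/deny filter exercised by the acl tests.
# Real setup.sh behavior may differ (e.g. wildcard handling).
pci_can_use() {
    local bdf=$1 blocked allowed
    # An explicit block list always wins.
    for blocked in $PCI_BLOCKED; do
        [[ $blocked == "$bdf" ]] && return 1
    done
    # An empty allow list means "everything"; otherwise require a match.
    [[ -z $PCI_ALLOWED ]] && return 0
    for allowed in $PCI_ALLOWED; do
        [[ $allowed == "$bdf" ]] && return 0
    done
    return 1
}
```

With `PCI_BLOCKED=' 0000:65:00.0'` the controller is skipped ("Skipping denied controller"), and with `PCI_ALLOWED=0000:65:00.0` it is bound to vfio-pci, matching the two log messages grepped for below.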
00:03:05.312 13:31:19 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:05.312 13:31:19 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:05.312 13:31:19 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:05.312 ************************************ 00:03:05.312 START TEST denied 00:03:05.312 ************************************ 00:03:05.312 13:31:19 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # denied 00:03:05.312 13:31:19 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0' 00:03:05.312 13:31:19 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:05.312 13:31:19 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0' 00:03:05.312 13:31:19 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:05.312 13:31:19 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:09.598 0000:65:00.0 (144d a80a): Skipping denied controller at 0000:65:00.0 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]] 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:09.598 13:31:23 setup.sh.acl.denied -- setup/common.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:14.900 00:03:14.900 real 0m9.266s 00:03:14.900 user 0m2.984s 00:03:14.900 sys 0m5.455s 00:03:14.900 13:31:28 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:14.900 13:31:28 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:14.900 ************************************ 00:03:14.900 END TEST denied 00:03:14.900 ************************************ 00:03:14.900 13:31:28 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:14.900 13:31:28 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:14.900 13:31:28 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:14.900 13:31:28 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:14.900 ************************************ 00:03:14.900 START TEST allowed 00:03:14.900 ************************************ 00:03:14.900 13:31:28 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # allowed 00:03:14.900 13:31:28 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0 00:03:14.900 13:31:28 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:14.900 13:31:28 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*' 00:03:14.900 13:31:28 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:14.900 13:31:28 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:20.184 0000:65:00.0 (144d a80a): nvme -> vfio-pci 00:03:20.184 13:31:34 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:20.184 13:31:34 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:20.184 13:31:34 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:20.184 13:31:34 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:20.184 13:31:34 setup.sh.acl.allowed -- 
setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:24.390 00:03:24.390 real 0m9.815s 00:03:24.390 user 0m2.614s 00:03:24.390 sys 0m5.310s 00:03:24.390 13:31:38 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:24.390 13:31:38 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:24.390 ************************************ 00:03:24.390 END TEST allowed 00:03:24.390 ************************************ 00:03:24.390 00:03:24.390 real 0m27.800s 00:03:24.390 user 0m8.852s 00:03:24.390 sys 0m16.427s 00:03:24.390 13:31:38 setup.sh.acl -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:24.390 13:31:38 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:24.390 ************************************ 00:03:24.390 END TEST acl 00:03:24.390 ************************************ 00:03:24.390 13:31:38 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:24.390 13:31:38 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:24.390 13:31:38 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:24.390 13:31:38 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:24.390 ************************************ 00:03:24.390 START TEST hugepages 00:03:24.390 ************************************ 00:03:24.390 13:31:38 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:24.653 * Looking for test storage... 
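The long run of `continue` lines in the hugepages trace comes from `get_meminfo` in `setup/common.sh`: it reads `/proc/meminfo` one field at a time, skipping every field until it reaches the requested one (here `Hugepagesize`, which finally triggers `echo 2048`). A minimal equivalent, assuming the simple non-NUMA case only (the real helper also supports a per-node `meminfo` via `/sys/devices/system/node`):

```shell
# Condensed sketch of get_meminfo: print the value of one /proc/meminfo
# field, discarding the unit column. NUMA-node support is omitted.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}
```

Usage mirroring the trace: `default_hugepages=$(get_meminfo Hugepagesize)` yields the page size in kB (2048 on this node).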
00:03:24.653 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.653 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 112357116 kB' 'MemAvailable: 112802076 kB' 'Buffers: 2112 kB' 'Cached: 11903000 kB' 'SwapCached: 0 kB' 'Active: 12018216 kB' 'Inactive: 485328 kB' 'Active(anon): 11512844 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 602036 kB' 'Mapped: 176712 kB' 'Shmem: 10914412 kB' 'KReclaimable: 515640 kB' 'Slab: 1124112 kB' 'SReclaimable: 515640 kB' 'SUnreclaim: 608472 kB' 'KernelStack: 26944 kB' 'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 72041340 kB' 'Committed_AS: 13095656 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228160 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.654 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:24.655 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # continue
[... the setup/common.sh@31 IFS=': ' / read -r var val _ and @32 compare-and-continue cycle repeats for the remaining /proc/meminfo keys: Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd, HugePages_Surp ...]
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/common.sh@33 -- # return 0
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
[... @29/@30 repeated once more for the second NUMA node ...]
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
[... @39 for node in "${!nodes_sys[@]}" / @40 for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* / @41 echo 0, executed for each hugepage pool of both nodes ...]
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:03:24.656 13:31:38 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:03:24.656 13:31:38 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:03:24.656 13:31:38 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:03:24.656 13:31:38 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:24.656 ************************************
00:03:24.656 START TEST default_setup
00:03:24.656 ************************************
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # default_setup
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
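
[editor's note] The @31/@32 cycle traced above is setup/common.sh's get_meminfo helper scanning "key: value" records with IFS=': ' until the requested key (here Hugepagesize) matches, then echoing the value. A minimal standalone sketch of the same pattern; the function name get_meminfo_value and the sample data are illustrative, not taken from setup/common.sh:

```shell
#!/usr/bin/env bash
# Scan "key: value kB" records from stdin with IFS=': ' and print the
# value for the requested key, mirroring the get_meminfo loop in the trace.
get_meminfo_value() {
	local get=$1 var val _
	while IFS=': ' read -r var val _; do
		# Skip every record until the key matches, as the @32 checks do.
		[[ $var == "$get" ]] || continue
		echo "$val"
		return 0
	done
	return 1
}

# Hypothetical /proc/meminfo excerpt for demonstration.
sample='MemTotal: 131499780 kB
Hugepagesize: 2048 kB
HugePages_Total: 1024'

get_meminfo_value Hugepagesize <<<"$sample"      # prints 2048
get_meminfo_value HugePages_Total <<<"$sample"   # prints 1024
```

In the real script the records come from mapfile over /proc/meminfo (or a per-node meminfo under /sys/devices/system/node), which is why the trace shows one @31/@32 pair per meminfo key before the match.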
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:03:24.656 13:31:39 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:28.865 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci
00:03:28.865 0000:65:00.0 (144d a80a): nvme -> vfio-pci
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114540696 kB' 'MemAvailable: 114985272 kB' 'Buffers: 2112 kB' 'Cached: 11903136 kB' 'SwapCached: 0 kB' 'Active: 12031852 kB' 'Inactive: 485328 kB' 'Active(anon): 11526480 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614732 kB' 'Mapped: 176196 kB' 'Shmem: 10914548 kB' 'KReclaimable: 515256 kB' 'Slab: 1121484 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 606228 kB' 'KernelStack: 26976 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13108968 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228540 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB'
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:29.132 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[... the @31 IFS=': ' / read -r var val _ and @32 compare-and-continue cycle repeats for every /proc/meminfo key from MemFree through HardwareCorrupted ...]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114542752 kB' 'MemAvailable: 114987328 kB' 'Buffers: 2112 kB' 'Cached: 11903136 kB' 'SwapCached: 0 kB' 'Active: 12032008 kB' 'Inactive: 485328 kB' 'Active(anon): 11526636 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614884 kB' 'Mapped: 176196 kB' 'Shmem: 10914548 kB' 'KReclaimable: 515256 kB' 'Slab: 1121476 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 606220 kB' 'KernelStack: 26928 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13110592 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228540 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB'
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[... the cycle repeats for MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal ...]
00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:29.134 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.134 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.135 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 
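The trace above is the `get_meminfo` helper in setup/common.sh scanning /proc/meminfo key by key until it hits the requested field. A minimal standalone sketch of that pattern, assuming a Linux host with /proc/meminfo; the function name and the `sed` prefix-stripping are illustrative, not the actual SPDK implementation:

```shell
#!/usr/bin/env bash
# Sketch: look up one /proc/meminfo key the way the xtrace above does it.
# get_meminfo_sketch KEY [NODE] -> prints the value column, returns 0 on match.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-NUMA-node stats live under /sys; every line there carries a
    # "Node N " prefix, which the real helper strips with a pattern expansion.
    [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    # IFS=': ' splits "Key:   value kB" into key, value, and unit fields.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}

get_meminfo_sketch HugePages_Surp   # e.g. prints 0, as in the trace above
```

On a key miss the loop simply falls through to the next line, which is why the raw xtrace shows one `[[ ... ]] / continue` pair per meminfo field.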
00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114541328 kB' 'MemAvailable: 114985904 kB' 'Buffers: 2112 kB' 'Cached: 11903156 kB' 'SwapCached: 0 kB' 'Active: 12031868 kB' 'Inactive: 485328 kB' 'Active(anon): 11526496 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615168 kB' 'Mapped: 176120 kB' 'Shmem: 10914568 kB' 'KReclaimable: 515256 kB' 'Slab: 1120992 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605736 kB' 'KernelStack: 26944 kB' 'PageTables: 8916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13110616 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228540 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:29.136 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:29.136-00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # [xtrace condensed: every key from MemTotal through CommitLimit fails [[ $var == HugePages_Rsvd ]] and hits continue; trace continues past this excerpt]
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.137 
13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.137 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:29.138 nr_hugepages=1024 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:29.138 resv_hugepages=0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:29.138 surplus_hugepages=0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:29.138 anon_hugepages=0 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114541400 kB' 'MemAvailable: 114985976 kB' 'Buffers: 2112 kB' 'Cached: 11903176 kB' 'SwapCached: 0 kB' 'Active: 12031812 kB' 'Inactive: 485328 kB' 'Active(anon): 11526440 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615048 kB' 'Mapped: 176120 kB' 'Shmem: 10914588 kB' 'KReclaimable: 515256 kB' 'Slab: 1120992 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605736 kB' 'KernelStack: 26816 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13107788 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228492 kB' 'VmallocChunk: 0 kB' 'Percpu: 
126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:29.138
00:03:29.138 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31-@32 -- # [[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue  (trace repeated for each /proc/meminfo field before HugePages_Total: MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted)
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # ((
nodes_test[node] += resv )) 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 60523488 kB' 'MemUsed: 5088412 kB' 'SwapCached: 0 kB' 'Active: 1595208 kB' 'Inactive: 310716 kB' 'Active(anon): 1347872 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1637620 kB' 'Mapped: 150836 kB' 'AnonPages: 271472 kB' 'Shmem: 1079568 kB' 'KernelStack: 12472 kB' 'PageTables: 3960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 643200 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 329464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 
13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.140 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:29.141 13:31:43 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:29.141 node0=1024 expecting 1024 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:29.141 00:03:29.141 real 0m4.499s 00:03:29.141 user 0m1.705s 00:03:29.141 sys 0m2.825s 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:29.141 13:31:43 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:29.141 ************************************ 00:03:29.141 END TEST default_setup 00:03:29.141 ************************************ 00:03:29.141 13:31:43 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:29.141 13:31:43 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:29.141 13:31:43 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:29.141 13:31:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:29.141 ************************************ 00:03:29.141 START TEST per_node_1G_alloc 00:03:29.141 ************************************ 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # per_node_1G_alloc 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:29.141 13:31:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:29.141 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:29.142 13:31:43 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:29.142 13:31:43 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:33.356 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:65:00.0 (144d a80a): Already using the vfio-pci driver 00:03:33.356 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local 
sorted_t 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114528696 kB' 'MemAvailable: 
114973272 kB' 'Buffers: 2112 kB' 'Cached: 11903296 kB' 'SwapCached: 0 kB' 'Active: 12030204 kB' 'Inactive: 485328 kB' 'Active(anon): 11524832 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 612800 kB' 'Mapped: 175132 kB' 'Shmem: 10914708 kB' 'KReclaimable: 515256 kB' 'Slab: 1119816 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604560 kB' 'KernelStack: 26832 kB' 'PageTables: 8412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13095744 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228428 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.356 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / [[ key == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue iterations for the remaining /proc/meminfo keys (Active(anon) through HardwareCorrupted) elided ...]
00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.358 13:31:47
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114530316 kB' 'MemAvailable: 114974892 kB' 'Buffers: 2112 kB' 'Cached: 11903296 kB' 'SwapCached: 0 kB' 'Active: 12030256 kB' 'Inactive: 485328 kB' 'Active(anon): 11524884 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613392 kB' 'Mapped: 175132 kB' 'Shmem: 10914708 kB' 'KReclaimable: 515256 kB' 'Slab: 1119780 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604524 kB' 'KernelStack: 26816 kB' 'PageTables: 8360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13095892 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228428 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.358 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / [[ key == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue iterations for the remaining /proc/meminfo keys (MemAvailable through HugePages_Rsvd) elided ...]
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- #
get_meminfo HugePages_Rsvd
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:33.360 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114533048 kB' 'MemAvailable: 114977624 kB' 'Buffers: 2112 kB' 'Cached: 11903336 kB' 'SwapCached: 0 kB' 'Active: 12030628 kB' 'Inactive: 485328 kB' 'Active(anon): 11525256 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613744 kB' 'Mapped: 175132 kB' 'Shmem: 10914748 kB' 'KReclaimable: 515256 kB' 'Slab: 1119792 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604536 kB' 'KernelStack: 26832 kB' 'PageTables: 8472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13096284 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228428 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB'
[... repeated per-field scan elided: every meminfo field from MemTotal through HugePages_Free is compared against HugePages_Rsvd and skipped via 'continue' ...]
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:33.362 nr_hugepages=1024
13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:33.362 resv_hugepages=0
13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:33.362 surplus_hugepages=0
13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:33.362 anon_hugepages=0
13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:33.362 13:31:47
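The trace above repeatedly exercises a `get_meminfo` helper: it reads `/proc/meminfo` (or a node-local meminfo when a NUMA node is given), strips any `Node N` prefix, and scans field by field until the requested one matches, echoing its value. The following is a minimal sketch reconstructed from the trace output, not copied from `setup/common.sh`; the exact argument handling and return conventions are assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo helper seen in the trace (reconstructed, hypothetical).
shopt -s extglob  # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1      # field name, e.g. HugePages_Rsvd
    local node=${2:-} # optional NUMA node number
    local var val _ line
    local mem_f=/proc/meminfo mem
    # Per-node queries read the node-local meminfo instead of the global one.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines carry a "Node N " prefix; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        # Skip every field until the requested one, mirroring the
        # [[ ... == \H\u\g\e... ]] / continue pairs in the trace.
        [[ $var == "$get" ]] || continue
        echo "$val"  # kB value, or a bare page count for HugePages_* fields
        return 0
    done
    return 1
}
```

Under this sketch, `surp=$(get_meminfo HugePages_Surp)` and `resv=$(get_meminfo HugePages_Rsvd)` would produce the `surp=0` / `resv=0` assignments logged at `hugepages.sh@99` and `@100`.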
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:33.362 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114533744 kB' 'MemAvailable: 114978320 kB' 'Buffers: 2112 kB' 'Cached: 11903356 kB' 'SwapCached: 0 kB' 'Active: 12030316 kB' 'Inactive: 485328 kB' 'Active(anon): 11524944 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613460 kB' 'Mapped: 175132 kB' 'Shmem: 10914768 kB' 'KReclaimable: 515256 kB' 'Slab: 1119792 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604536 kB' 'KernelStack: 26848 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13096308 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228428 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB'
[... repeated per-field scan elided: meminfo fields from MemTotal through Zswapped are compared against HugePages_Total and skipped via 'continue' ...]
00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty ==
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.363 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.364 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 61575100 kB' 'MemUsed: 4036800 kB' 'SwapCached: 0 kB' 'Active: 1593840 kB' 'Inactive: 310716 kB' 'Active(anon): 1346504 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 
kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1637688 kB' 'Mapped: 150684 kB' 'AnonPages: 270008 kB' 'Shmem: 1079636 kB' 'KernelStack: 12488 kB' 'PageTables: 3888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642240 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 328504 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... 00:03:33.364-00:03:33.366: identical setup/common.sh@31-32 xtrace repeats for each node0 meminfo field (MemTotal, MemFree, MemUsed, SwapCached, ... HugePages_Free), each failing the HugePages_Surp match and continuing ...]
00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.366 13:31:47
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65887880 kB' 'MemFree: 52958476 kB' 'MemUsed: 12929404 kB' 'SwapCached: 0 kB' 'Active: 10436496 kB' 'Inactive: 174612 kB' 'Active(anon): 10178460 kB' 'Inactive(anon): 0 kB' 'Active(file): 258036 kB' 'Inactive(file): 174612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10267828 kB' 'Mapped: 24448 kB' 'AnonPages: 343372 kB' 'Shmem: 9835180 kB' 'KernelStack: 14344 kB' 'PageTables: 4544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 201520 kB' 'Slab: 477544 kB' 'SReclaimable: 201520 kB' 'SUnreclaim: 276024 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.366 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:33.367 node0=512 expecting 512 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.367 
13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:33.367 node1=512 expecting 512 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:33.367 00:03:33.367 real 0m4.067s 00:03:33.367 user 0m1.546s 00:03:33.367 sys 0m2.570s 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:33.367 13:31:47 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:33.367 ************************************ 00:03:33.367 END TEST per_node_1G_alloc 00:03:33.367 ************************************ 00:03:33.367 13:31:47 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:33.367 13:31:47 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:33.367 13:31:47 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:33.367 13:31:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:33.367 ************************************ 00:03:33.367 START TEST even_2G_alloc 00:03:33.367 ************************************ 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # even_2G_alloc 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 
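The long `IFS=': ' / read -r var val _ / continue` runs above are setup/common.sh's `get_meminfo` scanning every meminfo key until it hits the one requested (here `HugePages_Surp`), after stripping the `Node N ` prefix that per-node sysfs files carry. A minimal standalone sketch of that parsing pattern (the function name and test file are illustrative, not the SPDK originals):

```shell
#!/usr/bin/env bash
shopt -s extglob  # needed for the +([0-9]) pattern below

# parse_meminfo FILE KEY — echo the value column for KEY in a meminfo-style
# file. Per-node files (/sys/devices/system/node/nodeN/meminfo) prefix each
# line with "Node N ", so strip that first, then split on ": " as the trace
# does with IFS=': ' read -r var val _.
parse_meminfo() {
	local file=$1 get=$2
	local -a mem
	mapfile -t mem < "$file"
	mem=("${mem[@]#Node +([0-9]) }")  # drop "Node N " prefix, if present
	local line var val _
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done
	return 1
}
```

Usage mirrors the trace, e.g. `parse_meminfo /sys/devices/system/node/node1/meminfo HugePages_Surp` would return the `0` echoed at `common.sh@33` above; the real script additionally falls back to `/proc/meminfo` when no node is given.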
00:03:33.367 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:33.368 13:31:47 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:37.579 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:37.579 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:37.579 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:37.579 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:37.579 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:65:00.0 (144d a80a): Already using the vfio-pci driver 00:03:37.580 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@91 -- # local sorted_s 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114541740 kB' 'MemAvailable: 114986316 kB' 'Buffers: 2112 kB' 'Cached: 11903496 kB' 'SwapCached: 0 kB' 'Active: 12031988 kB' 'Inactive: 485328 kB' 'Active(anon): 11526616 
kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614876 kB' 'Mapped: 175160 kB' 'Shmem: 10914908 kB' 'KReclaimable: 515256 kB' 'Slab: 1119540 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604284 kB' 'KernelStack: 26960 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100204 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228636 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.580 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:37.581 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114543708 kB' 'MemAvailable: 114988284 kB' 'Buffers: 2112 kB' 'Cached: 11903496 kB' 'SwapCached: 0 kB' 'Active: 12031576 kB' 'Inactive: 485328 kB' 'Active(anon): 11526204 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614544 kB' 'Mapped: 175152 kB' 'Shmem: 10914908 kB' 'KReclaimable: 515256 kB' 'Slab: 1119580 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604324 kB' 'KernelStack: 27040 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 
'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100220 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228652 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.581 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.582 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.582 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:37.583 13:31:51
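The xtrace records above come from a common.sh helper that scans /proc/meminfo with `IFS=': '` and `read -r var val _`, skipping each line until the requested field name matches, then echoing its value. A minimal standalone sketch of that pattern (the function name and the optional file argument are assumptions for illustration, not SPDK's actual common.sh):

```shell
#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup, assuming the behavior visible
# in the trace: split each meminfo line on ': ', compare the field
# name, and print the value of the first match.
get_meminfo() {
    # $1 = field name (e.g. HugePages_Total)
    # $2 = optional meminfo file, defaults to /proc/meminfo
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # e.g. "MemTotal: 131499780 kB" -> var=MemTotal, val=131499780
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1  # field not present
}

get_meminfo MemTotal
```

With `IFS=': '`, `read` splits on both the colon and the surrounding spaces, so the unit suffix ("kB") lands in the discarded `_` variable and only the numeric value is printed.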
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114540464 kB' 'MemAvailable: 114985040 kB' 'Buffers: 2112 kB' 'Cached: 11903516 kB' 'SwapCached: 0 kB' 'Active: 12032052 kB' 'Inactive: 485328 kB' 'Active(anon): 11526680 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615004 kB' 'Mapped: 175152 kB' 'Shmem: 10914928 kB' 
'KReclaimable: 515256 kB' 'Slab: 1119540 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604284 kB' 'KernelStack: 27072 kB' 'PageTables: 9012 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100244 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228700 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:37.583 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:37.584 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 --
# echo nr_hugepages=1024 00:03:37.585 nr_hugepages=1024 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:37.585 resv_hugepages=0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:37.585 surplus_hugepages=0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:37.585 anon_hugepages=0 00:03:37.585 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 
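The `(( 1024 == nr_hugepages + surp + resv ))` check above verifies that the kernel's reported hugepage total is fully accounted for by the requested, surplus, and reserved page counts. A sketch of that accounting check in isolation (values hardcoded to mirror this run of the log; in the real script they come from get_meminfo lookups):

```shell
#!/usr/bin/env bash
# Sketch of the hugepage accounting check hugepages.sh performs:
# HugePages_Total must equal the requested page count plus any
# surplus and reserved pages reported in /proc/meminfo.
nr_hugepages=1024  # requested via vm.nr_hugepages
surp=0             # HugePages_Surp
resv=0             # HugePages_Rsvd
total=1024         # HugePages_Total

if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total"
else
    echo "unexpected hugepage count: total=$total" >&2
    exit 1
fi
```

A mismatch here would mean another process grabbed or released hugepages between the allocation and the check, which is exactly the condition the test is guarding against.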
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114540672 kB' 'MemAvailable: 114985248 kB' 'Buffers: 2112 kB' 'Cached: 11903536 kB' 'SwapCached: 0 kB' 'Active: 12031868 kB' 'Inactive: 485328 kB' 'Active(anon): 11526496 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614788 kB' 'Mapped: 175152 kB' 'Shmem: 10914948 kB' 'KReclaimable: 515256 kB' 'Slab: 1119508 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604252 kB' 'KernelStack: 27040 kB' 'PageTables: 9156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100264 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228700 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 
read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.586 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:37.587 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:37.587 
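The trace above is a bash helper scanning meminfo-style output field by field, skipping non-matching keys with `continue` and returning the value on the first match. A minimal sketch of that pattern (the function name and demo file path are illustrative assumptions, not the exact SPDK `setup/common.sh` source):

```shell
#!/usr/bin/env bash
# Sketch: look up one field in "Key: value kB" output, assuming the
# traced loop's approach: split each line on ': ', skip non-matching
# keys with `continue`, and echo the value on the first match.
get_meminfo_field() {
    local get=$1 file=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$file"
    return 1   # key not present
}

# Demo against a fixed snippet instead of the live /proc/meminfo:
printf '%s\n' 'MemTotal: 131499780 kB' 'HugePages_Total: 1024' > /tmp/meminfo.demo
get_meminfo_field HugePages_Total /tmp/meminfo.demo   # prints 1024
```

Splitting with `IFS=': '` is what makes `var` the key and `val` the bare number, so hugepage counts can feed straight into arithmetic checks like the `(( 1024 == nr_hugepages + surp + resv ))` seen in the trace.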
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 61575596 kB' 'MemUsed: 4036304 kB' 'SwapCached: 0 kB' 'Active: 1593728 kB' 'Inactive: 310716 kB' 'Active(anon): 1346392 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1637752 kB' 'Mapped: 150704 kB' 'AnonPages: 269872 kB' 'Shmem: 1079700 kB' 'KernelStack: 12552 kB' 'PageTables: 4088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642316 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 328580 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.588
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.588
[... identical setup/common.sh@31/@32 trace repeated for each remaining node0 meminfo field ...]
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.589
13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31
-- # read -r var val _ 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:37.589 13:31:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:37.589 13:31:51 
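The trace above shows setup/common.sh switching from /proc/meminfo to /sys/devices/system/node/node1/meminfo and then stripping the per-node line prefix with `mem=("${mem[@]#Node +([0-9]) }")`, an extglob pattern substitution. A minimal standalone sketch of that prefix-strip trick, using synthetic array contents rather than a real node file:

```shell
#!/usr/bin/env bash
shopt -s extglob   # required for the +([0-9]) pattern below

# Per-node meminfo lines carry a "Node <N> " prefix; strip it from
# every array element, as setup/common.sh@29 does in the log above.
mem=('Node 1 MemTotal: 65887880 kB' 'Node 1 HugePages_Total: 512')
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]}"
# prints:
# MemTotal: 65887880 kB
# HugePages_Total: 512
```

After the strip, the per-node file parses with the same `IFS=': ' read -r var val _` loop as plain /proc/meminfo.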
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65887880 kB' 'MemFree: 52965652 kB' 'MemUsed: 12922228 kB' 'SwapCached: 0 kB' 'Active: 10438216 kB' 'Inactive: 174612 kB' 'Active(anon): 10180180 kB' 'Inactive(anon): 0 kB' 'Active(file): 258036 kB' 'Inactive(file): 174612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10267896 kB' 'Mapped: 24952 kB' 'AnonPages: 344992 kB' 'Shmem: 9835248 kB' 'KernelStack: 14504 kB' 'PageTables: 4972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 201520 kB' 'Slab: 477192 kB' 'SReclaimable: 201520 kB' 'SUnreclaim: 275672 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[repeated for node1 fields MemTotal through HugePages_Free: '[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' fails and the loop continues]
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:37.591 node0=512 expecting 512
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:37.591 node1=512 expecting 512
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:37.591
00:03:37.591 real 0m4.287s
00:03:37.591 user 0m1.727s
00:03:37.591 sys 0m2.616s
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:03:37.591 13:31:52 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:03:37.591 ************************************
00:03:37.591 END TEST even_2G_alloc
00:03:37.591 ************************************
00:03:37.851 13:31:52 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:37.851 13:31:52 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:03:37.852 13:31:52
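The even_2G_alloc trace above is dominated by setup/common.sh's get_meminfo walking a meminfo file field by field: every non-matching line produces one `[[ <field> == ... ]]` check followed by `continue`, until the requested field matches and its value is echoed. A minimal standalone sketch of that parsing pattern (`get_field` is a hypothetical helper name taking an explicit file path, not SPDK's real function):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern from setup/common.sh:
# scan "<field>: <value> ..." lines, print the value of one field.
get_field() {
    local get=$1 mem_f=$2 var val _
    while IFS=': ' read -r var val _; do
        # Skip every line until the field name matches, like the
        # repeated "continue" iterations in the trace above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    echo 0   # field absent: report 0, mirroring 'echo 0; return 0'
}

# Demo against a fabricated meminfo snippet (not real node data):
tmp=$(mktemp)
printf '%s\n' 'MemTotal: 65887880 kB' 'HugePages_Total: 512' \
    'HugePages_Free: 512' 'HugePages_Surp: 0' > "$tmp"
get_field HugePages_Surp "$tmp"   # prints 0
get_field HugePages_Total "$tmp"  # prints 512
rm -f "$tmp"
```

The harness then feeds that value into `(( nodes_test[node] += surp/resv ))` per node before comparing against the expected per-node counts.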
setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:03:37.852 13:31:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:37.852 ************************************
00:03:37.852 START TEST odd_alloc
00:03:37.852 ************************************
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # odd_alloc
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:03:37.852 13:31:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:42.066 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:65:00.0 (144d a80a): Already using the vfio-pci driver
00:03:42.066 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:42.066 13:31:56
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.066 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.067 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114538468 kB' 'MemAvailable: 114983044 kB' 'Buffers: 2112 kB' 'Cached: 11903696 kB' 'SwapCached: 0 kB' 'Active: 12033356 kB' 'Inactive: 485328 kB' 'Active(anon): 11527984 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615820 kB' 'Mapped: 175492 kB' 'Shmem: 10915108 kB' 'KReclaimable: 515256 kB' 'Slab: 1119988 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604732 kB' 'KernelStack: 26896 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73088892 kB' 'Committed_AS: 13098304 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228476 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:42.067 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.067 13:31:56 setup.sh.hugepages.odd_alloc 
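The odd_alloc setup traced above requests 2098176 kB, i.e. 1025 hugepages of 2048 kB, and get_test_nr_hugepages_per_node leaves nodes_test[0]=513 and nodes_test[1]=512: an even split with the odd remainder pushed onto one node, which the meminfo dump then confirms as HugePages_Total: 1025. A rough sketch of that split logic (`split_hugepages` is an illustrative name, not SPDK's helper):

```shell
#!/usr/bin/env bash
# Sketch of the per-node split implied by get_test_nr_hugepages_per_node:
# divide nr_hugepages evenly across nodes and give the remainder
# to the first node(s), so 1025 pages on 2 nodes becomes 513 + 512.
split_hugepages() {
    local total=$1 nodes=$2 base rem i
    base=$(( total / nodes ))
    rem=$(( total % nodes ))
    for (( i = 0; i < nodes; i++ )); do
        # the first $rem nodes each get one extra page
        echo "node${i}=$(( base + (i < rem ? 1 : 0) ))"
    done
}

split_hugepages 1025 2
# prints:
# node0=513
# node1=512
```

An even request such as 1024 pages splits 512 + 512, which is exactly what the even_2G_alloc test earlier verified per node.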
-- setup/common.sh@32 -- # continue
[repeated for fields MemFree through AnonPages: '[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]' fails and the loop continues]
00:03:42.067 13:31:56 setup.sh.hugepages.odd_alloc --
setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 
13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.068 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:42.068 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114540356 kB' 'MemAvailable: 114984932 kB' 'Buffers: 2112 kB' 'Cached: 11903696 kB' 'SwapCached: 0 kB' 'Active: 12032424 kB' 'Inactive: 485328 kB' 'Active(anon): 11527052 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614784 kB' 'Mapped: 175248 kB' 'Shmem: 10915108 kB' 'KReclaimable: 515256 kB' 'Slab: 1119924 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604668 kB' 'KernelStack: 26816 kB' 'PageTables: 8368 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73088892 kB' 'Committed_AS: 13098320 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228444 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.069 13:31:56 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # continue 00:03:42.069
[... identical per-field checks repeated for the remaining /proc/meminfo fields ...]
00:03:42.071 13:31:56
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.071 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114541356 kB' 'MemAvailable: 114985932 kB' 'Buffers: 2112 kB' 'Cached: 11903716 kB' 'SwapCached: 0 kB' 'Active: 12032216 kB' 'Inactive: 485328 kB' 'Active(anon): 11526844 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615004 kB' 'Mapped: 175172 kB' 'Shmem: 10915128 kB' 'KReclaimable: 515256 kB' 'Slab: 1119912 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604656 kB' 'KernelStack: 26864 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73088892 kB' 'Committed_AS: 13098340 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228444 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.072 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.073 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.074 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:42.074 nr_hugepages=1025 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:42.074 resv_hugepages=0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:42.074 surplus_hugepages=0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:42.074 anon_hugepages=0 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:42.074 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114541676 kB' 'MemAvailable: 114986252 kB' 'Buffers: 2112 kB' 'Cached: 11903736 kB' 'SwapCached: 0 kB' 'Active: 12032256 kB' 'Inactive: 485328 kB' 'Active(anon): 11526884 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615004 kB' 'Mapped: 175172 kB' 'Shmem: 10915148 kB' 'KReclaimable: 515256 kB' 'Slab: 1119912 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604656 kB' 'KernelStack: 26864 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73088892 kB' 'Committed_AS: 13098364 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228444 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 
13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.075 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.075 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue [... identical per-key xtrace (continue / IFS=': ' / read -r var val _) for each non-matching /proc/meminfo field, Inactive through Unaccepted, elided ...] 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 61567372 kB' 'MemUsed: 4044528 kB' 'SwapCached: 0 kB' 'Active: 1592968 kB' 'Inactive: 310716 kB' 'Active(anon): 1345632 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1637920 kB' 'Mapped: 150724 kB' 
'AnonPages: 268952 kB' 'Shmem: 1079868 kB' 'KernelStack: 12488 kB' 'PageTables: 3888 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642304 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 328568 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.077 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue [... identical per-key xtrace (continue / IFS=': ' / read -r var val _) for each non-matching node0 meminfo field, MemFree through HugePages_Free, elided ...] 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.078 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65887880 kB' 'MemFree: 52974160 kB' 'MemUsed: 12913720 kB' 'SwapCached: 0 kB' 'Active: 10438908 kB' 'Inactive: 174612 kB' 'Active(anon): 10180872 kB' 'Inactive(anon): 0 kB' 'Active(file): 258036 kB' 'Inactive(file): 174612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10267964 kB' 'Mapped: 24448 kB' 'AnonPages: 345636 kB' 'Shmem: 9835316 kB' 'KernelStack: 14360 kB' 'PageTables: 4556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 201520 kB' 'Slab: 477608 kB' 'SReclaimable: 201520 kB' 'SUnreclaim: 276088 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.079 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:42.080 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:42.081 node0=512 expecting 513 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:42.081 node1=513 expecting 512 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:42.081 00:03:42.081 real 0m4.267s 00:03:42.081 user 0m1.562s 00:03:42.081 sys 0m2.773s 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:42.081 13:31:56 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:42.081 ************************************ 00:03:42.081 END TEST odd_alloc 00:03:42.081 ************************************ 00:03:42.081 13:31:56 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:42.081 13:31:56 setup.sh.hugepages -- 
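The xtrace records above repeat one loop over and over: `get_meminfo` reads `/proc/meminfo` (or the per-node `/sys/devices/system/node/nodeN/meminfo`), strips the `Node <n> ` prefix, splits each line on `": "` into `var`/`val`, and `continue`s until `var` matches the requested key (here `HugePages_Surp`). The following is a minimal Python sketch of that parsing logic, reconstructed from the trace; the function name mirrors `setup/common.sh`'s `get_meminfo`, but the implementation is an illustrative assumption, not the shell code itself.

```python
import re

def get_meminfo(meminfo_text, key):
    # Illustrative reconstruction (assumption) of the get_meminfo loop
    # traced above: per-node meminfo files prefix each line with
    # "Node <n> ", which is stripped before matching "Key: value".
    for line in meminfo_text.splitlines():
        line = re.sub(r"^Node +[0-9]+ ", "", line)
        var, sep, rest = line.partition(":")
        if not sep:
            continue
        fields = rest.split()
        if var == key and fields:
            return int(fields[0])  # kB for sizes, bare count for HugePages_*
    return 0

# Sample mirroring the node1 hugepage counters printed in the trace above.
sample = (
    "Node 1 HugePages_Total: 513\n"
    "Node 1 HugePages_Free: 513\n"
    "Node 1 HugePages_Surp: 0\n"
)
print(get_meminfo(sample, "HugePages_Surp"))  # 0
```

With a surplus of 0 on each node, the test's per-node expectations (`node0=512 expecting 513`, `node1=513 expecting 512`) come purely from the odd-allocation split, which is what the final `[[ 512 513 == \5\1\2\ \5\1\3 ]]` check passes on.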
common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:42.081 13:31:56 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:42.081 13:31:56 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:42.081 ************************************ 00:03:42.081 START TEST custom_alloc 00:03:42.081 ************************************ 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # custom_alloc 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:42.081 13:31:56 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # 
HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:42.081 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:42.082 13:31:56 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:42.082 13:31:56 
setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:46.295 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:65:00.0 (144d a80a): Already using the vfio-pci driver 00:03:46.295 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:46.295 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 113518876 kB' 'MemAvailable: 113963452 kB' 'Buffers: 2112 kB' 'Cached: 11903864 kB' 'SwapCached: 0 kB' 'Active: 12034516 kB' 'Inactive: 485328 kB' 'Active(anon): 11529144 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 
kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616764 kB' 'Mapped: 175292 kB' 'Shmem: 10915276 kB' 'KReclaimable: 515256 kB' 'Slab: 1120124 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604868 kB' 'KernelStack: 27072 kB' 'PageTables: 8928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 72565628 kB' 'Committed_AS: 13102336 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228684 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.295 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue [... identical IFS=': ' / read -r var val _ / compare / continue trace repeated for each remaining /proc/meminfo key until AnonHugePages ...] 00:03:46.297
13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 
-- # local var val 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 113519512 kB' 'MemAvailable: 113964088 kB' 'Buffers: 2112 kB' 'Cached: 11903868 kB' 'SwapCached: 0 kB' 'Active: 12033132 kB' 'Inactive: 485328 kB' 'Active(anon): 11527760 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615264 kB' 'Mapped: 175276 kB' 'Shmem: 10915280 kB' 'KReclaimable: 515256 kB' 'Slab: 1120124 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604868 kB' 'KernelStack: 26880 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 72565628 kB' 'Committed_AS: 13102352 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228700 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.297 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue [... identical IFS=': ' / read -r var val _ / compare / continue trace repeated for each remaining /proc/meminfo key until HugePages_Surp ...] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc --
setup/common.sh@19 -- # local var val 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 113516120 kB' 'MemAvailable: 113960696 kB' 'Buffers: 2112 kB' 'Cached: 11903888 kB' 'SwapCached: 0 kB' 'Active: 12034496 kB' 'Inactive: 485328 kB' 'Active(anon): 11529124 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617192 kB' 'Mapped: 175200 kB' 'Shmem: 10915300 kB' 'KReclaimable: 515256 kB' 'Slab: 1120120 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604864 kB' 'KernelStack: 26992 kB' 'PageTables: 8700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 72565628 kB' 'Committed_AS: 13122748 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228732 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 
'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.299 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.300 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:46.301 nr_hugepages=1536 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:46.301 resv_hugepages=0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:46.301 surplus_hugepages=0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:46.301 anon_hugepages=0 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages 
)) 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.301 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 113516536 kB' 'MemAvailable: 113961112 kB' 'Buffers: 2112 kB' 'Cached: 11903908 kB' 'SwapCached: 0 kB' 'Active: 12033764 kB' 'Inactive: 485328 kB' 'Active(anon): 11528392 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616364 kB' 'Mapped: 175704 kB' 'Shmem: 10915320 kB' 'KReclaimable: 515256 kB' 'Slab: 1120120 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604864 kB' 'KernelStack: 26928 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 72565628 kB' 'Committed_AS: 13101404 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228652 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:46.301 13:32:00 [setup/common.sh@31-32 per-field scan repeats for each /proc/meminfo field, MemTotal through Zswap; none matches HugePages_Total, so each iteration continues] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 --
# read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 
13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.302 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.303 
13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.303 13:32:00 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.303 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 61552296 kB' 'MemUsed: 4059604 kB' 'SwapCached: 0 kB' 'Active: 1598908 kB' 'Inactive: 310716 kB' 'Active(anon): 1351572 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1637984 kB' 'Mapped: 150752 kB' 'AnonPages: 274868 kB' 'Shmem: 1079932 kB' 'KernelStack: 12472 kB' 'PageTables: 3476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642456 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 328720 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:46.303 [trace condensed: the setup/common.sh@31-@32 IFS/read/continue loop scans the node0 meminfo keys just printed (MemTotal through HugePages_Free), skipping every key that is not HugePages_Surp] 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:46.304 13:32:00
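[editor's note] The repeated setup/common.sh@31/@32 read/continue records in this trace all come from one helper, get_meminfo. Below is a minimal sketch of that scan loop, reconstructed only from what the trace shows; the function body and the MEM_F override are assumptions for illustration, not SPDK's actual source.

```shell
#!/usr/bin/env bash
# Sketch of the meminfo scan seen in the trace: pick the per-node meminfo
# file when a node is given, strip the "Node N " prefix those files carry,
# then read key/value pairs until the requested key matches. Every
# non-matching key produces one of the "continue" records in the log.
shopt -s extglob  # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=${MEM_F:-/proc/meminfo}  # MEM_F override is a testing aid, not in the trace
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines look like "Node 0 HugePages_Total: 512"; drop the prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}
```

Against the node0 dump printed above, `get_meminfo HugePages_Surp 0` would print 0, matching the `echo 0` record in the trace.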
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:46.304 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65887880 kB' 'MemFree: 51961644 kB' 'MemUsed: 13926236 kB' 'SwapCached: 0 kB' 'Active: 10437872 kB' 'Inactive: 174612 kB' 'Active(anon): 10179836 kB' 'Inactive(anon): 0 kB' 'Active(file): 258036 kB' 'Inactive(file): 174612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10268080 kB' 'Mapped: 24952 kB' 'AnonPages: 344468 kB' 'Shmem: 9835432 kB' 'KernelStack: 14248 kB' 'PageTables: 4368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 201520 kB' 'Slab: 477664 kB' 'SReclaimable: 201520 kB' 'SUnreclaim: 276144 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:46.305 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # (loop repeats for MemTotal through HugePages_Free: no match, continue)
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:46.567 node0=512 expecting 512
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:46.567 node1=1024 expecting 1024
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:46.567
00:03:46.567 real	0m4.336s
00:03:46.567 user	0m1.723s
00:03:46.567 sys	0m2.680s
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable
00:03:46.567 13:32:00 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:03:46.567 ************************************
00:03:46.567 END TEST custom_alloc
00:03:46.567 ************************************
00:03:46.567 13:32:00 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:46.567 13:32:00 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:03:46.567 13:32:00 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable
00:03:46.567 13:32:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:46.567 ************************************
00:03:46.567 START TEST no_shrink_alloc
00:03:46.567 ************************************
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # no_shrink_alloc
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:46.567 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:03:46.568 13:32:00 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:50.775 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:65:00.0 (144d a80a): Already using the vfio-pci driver
00:03:50.775 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
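The `setup/hugepages.sh@96` test in the trace is a transparent-hugepage gate: the kernel marks the active THP mode in brackets (e.g. `always [madvise] never`), and `verify_nr_hugepages` only bothers reading `AnonHugePages` when the selected mode is not `[never]`. A sketch of that check, with `thp_state` as a stand-in for reading `/sys/kernel/mm/transparent_hugepage/enabled`:

```shell
#!/usr/bin/env bash
# Sketch of the THP gate seen in the trace. The bracketed token is the
# active mode; the glob asks "is the selected mode anything but [never]?".
# thp_state stands in for $(< /sys/kernel/mm/transparent_hugepage/enabled).
thp_state='always [madvise] never'
if [[ $thp_state != *"[never]"* ]]; then
    echo "THP not disabled; AnonHugePages is worth checking"
fi
```

The heavily escaped `*\[\n\e\v\e\r\]*` in the xtrace output is just bash printing the quoted glob `*"[never]"*` character by character.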
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:50.775 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114559304 kB' 'MemAvailable: 115003880 kB' 'Buffers: 2112 kB' 'Cached: 11904052 kB' 'SwapCached: 0 kB' 'Active: 12034036 kB' 'Inactive: 485328 kB' 'Active(anon): 11528664 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616380 kB' 'Mapped: 175216 kB' 'Shmem: 10915464 kB' 'KReclaimable: 515256 kB' 'Slab: 1119820 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604564 kB' 'KernelStack: 26816 kB' 'PageTables: 8340 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100472 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228572 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB'
00:03:50.776 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # (loop repeats for MemTotal through KernelStack: no match, continue)
13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:50.777 13:32:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114562868 kB' 'MemAvailable: 115007444 kB' 'Buffers: 2112 kB' 'Cached: 11904056 kB' 'SwapCached: 0 kB' 'Active: 12033868 kB' 'Inactive: 485328 kB' 'Active(anon): 11528496 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616292 kB' 'Mapped: 175216 kB' 'Shmem: 10915468 kB' 'KReclaimable: 515256 kB' 'Slab: 1119820 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604564 kB' 'KernelStack: 26864 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100492 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228572 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 
13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.777 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 
13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.778 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.779 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114563096 kB' 'MemAvailable: 115007672 kB' 'Buffers: 2112 kB' 'Cached: 11904072 kB' 'SwapCached: 0 kB' 'Active: 12033896 kB' 'Inactive: 485328 kB' 'Active(anon): 11528524 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 
'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616264 kB' 'Mapped: 175216 kB' 'Shmem: 10915484 kB' 'KReclaimable: 515256 kB' 'Slab: 1119884 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604628 kB' 'KernelStack: 26864 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100512 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228572 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.780 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.781 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 
13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:50.782 nr_hugepages=1024 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:50.782 resv_hugepages=0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:50.782 surplus_hugepages=0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:50.782 anon_hugepages=0 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114563112 kB' 'MemAvailable: 115007688 kB' 'Buffers: 2112 kB' 'Cached: 11904116 kB' 'SwapCached: 0 kB' 'Active: 12033540 kB' 'Inactive: 485328 kB' 'Active(anon): 11528168 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615864 kB' 'Mapped: 175216 kB' 'Shmem: 10915528 kB' 'KReclaimable: 515256 kB' 'Slab: 1119884 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 604628 kB' 'KernelStack: 26848 kB' 'PageTables: 8444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13100536 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228572 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:50.782 13:32:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 
13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:50.782 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the same IFS / read / compare / continue xtrace cycle repeats for each remaining /proc/meminfo field, Inactive(file) through Unaccepted, none matching HugePages_Total ...]
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
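The scan above is the `get_meminfo` pattern: split each "key: value" meminfo line on `': '` and emit the value once the requested key matches. A minimal standalone sketch of that pattern (the real helper lives in SPDK's `test/setup/common.sh`; the demo file below is a fabricated sample, not the live `/proc/meminfo`):

```shell
#!/usr/bin/env bash
# Sketch of the key/value scan seen in the xtrace: iterate "key: value"
# lines, print the value for the requested key, return 0 on a match.
get_meminfo() {
  local get=$1 mem_f=${2:-/proc/meminfo} var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done < "$mem_f"
  return 1
}

# Demo against a small sample file instead of the live /proc/meminfo:
sample=$(mktemp)
printf '%s\n' 'MemTotal: 65611900 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' > "$sample"
get_meminfo HugePages_Total "$sample"   # prints 1024
```

The `IFS=': '` split is why the trailing unit (`kB`) lands in the throwaway `_` variable and only the number is returned.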
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:50.784 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 60498544 kB' 'MemUsed: 5113356 kB' 'SwapCached: 0 kB' 'Active: 1596992 kB' 'Inactive: 310716 kB' 'Active(anon): 1349656 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1638052 kB' 'Mapped: 150768 kB' 'AnonPages: 272788 kB' 'Shmem: 1080000 kB' 'KernelStack: 12488 kB' 'PageTables: 3892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642624 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 328888 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... the compare/continue xtrace cycle repeats for each node0 meminfo field, MemTotal through HugePages_Free, none matching HugePages_Surp ...]
00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
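When a node is given, the trace switches `mem_f` to `/sys/devices/system/node/node0/meminfo`, whose lines carry a `Node 0 ` prefix; the `mem=("${mem[@]#Node +([0-9]) }")` step strips that prefix so the same key scan works. A small sketch of that strip (sample lines fabricated to match the log's node-0 values; `extglob` is needed for the `+([0-9])` pattern):

```shell
#!/usr/bin/env bash
# Per-node meminfo lines look like "Node 0 HugePages_Total: 1024".
# Strip the "Node <n> " prefix from every array element, as the trace does.
shopt -s extglob
mapfile -t mem < <(printf '%s\n' \
  'Node 0 HugePages_Total: 1024' \
  'Node 0 HugePages_Free: 1024' \
  'Node 0 HugePages_Surp: 0')
mem=("${mem[@]#Node +([0-9]) }")   # now plain "key: value" lines
printf '%s\n' "${mem[2]}"          # prints: HugePages_Surp: 0
```

After the strip, the per-node file parses identically to `/proc/meminfo`, which is why one scan loop serves both cases.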
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:50.786 node0=1024 expecting 1024 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.786 13:32:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:54.999 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:65:00.0 (144d a80a): Already using the vfio-pci driver 00:03:54.999 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:03:54.999 0000:00:01.2 (8086 0b00): Already 
using the vfio-pci driver 00:03:54.999 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:03:55.000 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:03:55.000 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:03:55.000 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # 
[[ -n '' ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114577892 kB' 'MemAvailable: 115022468 kB' 'Buffers: 2112 kB' 'Cached: 11904216 kB' 'SwapCached: 0 kB' 'Active: 12035684 kB' 'Inactive: 485328 kB' 'Active(anon): 11530312 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 618064 kB' 'Mapped: 175252 kB' 'Shmem: 10915628 kB' 'KReclaimable: 515256 kB' 'Slab: 1120268 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605012 kB' 'KernelStack: 26944 kB' 'PageTables: 9020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13103984 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228588 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.000 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.000 
13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 
13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.001 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.001 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114579892 kB' 'MemAvailable: 115024468 kB' 'Buffers: 2112 kB' 'Cached: 11904220 kB' 'SwapCached: 0 kB' 'Active: 12035476 kB' 'Inactive: 485328 kB' 'Active(anon): 11530104 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617888 kB' 'Mapped: 175236 kB' 'Shmem: 10915632 kB' 'KReclaimable: 515256 kB' 'Slab: 1120288 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605032 kB' 'KernelStack: 27056 kB' 'PageTables: 9036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13104008 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228588 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 
13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.002 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... identical per-field xtrace elided: the `IFS=': '` read/continue scan repeats for every remaining /proc/meminfo key, Inactive(file) through HugePages_Rsvd, none of which matches HugePages_Surp ...] 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 
00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.004 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114579796 kB' 'MemAvailable: 115024372 kB' 'Buffers: 2112 kB' 'Cached: 11904236 kB' 'SwapCached: 0 kB' 'Active: 12035552 kB' 'Inactive: 485328 kB' 'Active(anon): 11530180 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617836 kB' 'Mapped: 175236 kB' 'Shmem: 10915648 kB' 'KReclaimable: 515256 kB' 'Slab: 1120288 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605032 kB' 'KernelStack: 27024 kB' 'PageTables: 9168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13102308 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228620 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' [... identical per-field xtrace elided: the `IFS=': '` read/continue scan repeats for every /proc/meminfo key, MemTotal through HugePages_Free, none of which matches HugePages_Rsvd ...] 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:55.006 
nr_hugepages=1024 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:55.006 resv_hugepages=0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:55.006 surplus_hugepages=0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:55.006 anon_hugepages=0 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.006 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.007 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 131499780 kB' 'MemFree: 114577372 kB' 'MemAvailable: 115021948 kB' 'Buffers: 2112 kB' 'Cached: 11904260 kB' 'SwapCached: 0 kB' 'Active: 12035132 kB' 'Inactive: 485328 kB' 'Active(anon): 11529760 kB' 'Inactive(anon): 0 kB' 'Active(file): 505372 kB' 'Inactive(file): 485328 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617384 kB' 'Mapped: 175228 kB' 'Shmem: 10915672 kB' 'KReclaimable: 515256 kB' 'Slab: 1120288 kB' 'SReclaimable: 515256 kB' 'SUnreclaim: 605032 kB' 'KernelStack: 27120 kB' 'PageTables: 8800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 73089916 kB' 'Committed_AS: 13104184 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 228620 kB' 'VmallocChunk: 0 kB' 'Percpu: 126720 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1367848 kB' 'DirectMap2M: 46546944 kB' 'DirectMap1G: 88080384 kB' 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:55.007 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... the IFS=': ' / read / compare / continue trace repeats identically for each remaining /proc/meminfo key until HugePages_Total is reached ...]
00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo
1024 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:55.008 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 
00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65611900 kB' 'MemFree: 60511588 kB' 'MemUsed: 5100312 kB' 'SwapCached: 0 kB' 'Active: 1595080 kB' 'Inactive: 310716 kB' 'Active(anon): 1347744 kB' 'Inactive(anon): 0 kB' 'Active(file): 247336 kB' 'Inactive(file): 310716 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1638124 kB' 'Mapped: 150780 kB' 'AnonPages: 270904 kB' 'Shmem: 1080072 kB' 'KernelStack: 12536 kB' 'PageTables: 4060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 313736 kB' 'Slab: 642952 kB' 'SReclaimable: 313736 kB' 'SUnreclaim: 329216 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
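The xtrace records above all come from one small helper, `get_meminfo` in SPDK's `setup/common.sh`: it reads `/proc/meminfo` (or, when a node is given, that NUMA node's `meminfo` under `/sys/devices/system/node/`) record by record with `IFS=': ' read`, comparing each key against the requested one and echoing its value on the first match. A minimal sketch of that technique follows; the helper name matches the trace, but the body here is simplified and illustrative, not the exact upstream implementation:

```shell
#!/usr/bin/env bash
# Sketch of the meminfo lookup traced above (modeled on SPDK's
# setup/common.sh get_meminfo; simplified, illustrative only).
get_meminfo() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo var val _

    # With a node argument, read that NUMA node's own meminfo instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    # Per-node files prefix each line with "Node <n> "; strip it so both
    # files parse the same way, then split "Key:  value kB" on ": " and
    # stop at the first key that matches the request.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}

get_meminfo MemTotal          # system-wide value, in kB
get_meminfo HugePages_Surp 0  # same key, restricted to NUMA node 0
```

This explains the trace's shape: every non-matching key costs one `IFS=': '`, one `read`, one `[[ ... ]]`, and one `continue` record, so a single lookup of a key near the end of meminfo (such as `HugePages_Rsvd`) emits dozens of near-identical lines under `set -x`.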
00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... the IFS=': ' / read / compare / continue trace repeats identically for the intervening node0 meminfo keys ...]
00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.009 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:55.010 node0=1024 expecting 1024 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:55.010 00:03:55.010 real 0m8.521s 00:03:55.010 user 0m3.333s 00:03:55.010 sys 0m5.309s 00:03:55.010 13:32:09 
setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:55.010 13:32:09 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:55.010 ************************************ 00:03:55.010 END TEST no_shrink_alloc 00:03:55.010 ************************************ 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:55.010 13:32:09 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:55.010 00:03:55.010 real 0m30.609s 00:03:55.010 user 0m11.839s 00:03:55.010 sys 0m19.200s 00:03:55.010 13:32:09 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:55.010 13:32:09 setup.sh.hugepages -- 
common/autotest_common.sh@10 -- # set +x 00:03:55.010 ************************************ 00:03:55.010 END TEST hugepages 00:03:55.010 ************************************ 00:03:55.271 13:32:09 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:55.271 13:32:09 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:55.271 13:32:09 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:55.271 13:32:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:55.271 ************************************ 00:03:55.271 START TEST driver 00:03:55.271 ************************************ 00:03:55.271 13:32:09 setup.sh.driver -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:55.271 * Looking for test storage... 00:03:55.271 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:55.271 13:32:09 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:03:55.271 13:32:09 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:55.271 13:32:09 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:00.706 13:32:15 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:00.706 13:32:15 setup.sh.driver -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:00.706 13:32:15 setup.sh.driver -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:00.706 13:32:15 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:00.706 ************************************ 00:04:00.706 START TEST guess_driver 00:04:00.706 ************************************ 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # guess_driver 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 
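The guess_driver test that starts here picks vfio-pci when IOMMU groups exist (`/sys/kernel/iommu_groups/*` is non-empty, 364 groups in this run) and `modprobe --show-depends vfio_pci` resolves to real `.ko` modules. A hedged sketch of that decision, with the modprobe output injected as a parameter so it runs anywhere (the path is shortened for illustration):

```shell
#!/usr/bin/env bash

# Sketch of the pick_driver logic traced below: usable only if there are
# IOMMU groups AND the module dependency chain names actual .ko files.
pick_driver() {
    local deps=$1 ngroups=$2
    if (( ngroups > 0 )) && [[ $deps == *.ko* ]]; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}

# Illustrative modprobe --show-depends output (abbreviated).
deps='insmod /lib/modules/6.7.0/kernel/drivers/vfio/pci/vfio-pci.ko.xz'
pick_driver "$deps" 364   # prints vfio-pci
pick_driver '' 0          # prints No valid driver found
```

The real driver.sh also falls back to uio_pci_generic when vfio is unavailable; this sketch only covers the branch the log exercises.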
00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 364 > 0 )) 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:00.706 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:00.966 insmod 
/lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:00.966 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:00.966 Looking for driver=vfio-pci 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.966 13:32:15 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:05.174 13:32:18 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.174 13:32:18 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.174 13:32:18 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- 
setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.174 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- 
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:05.175 13:32:19 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:11.760 00:04:11.760 real 0m9.891s 00:04:11.760 user 0m3.241s 00:04:11.760 sys 0m5.760s 00:04:11.760 13:32:25 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:11.760 13:32:25 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:11.760 ************************************ 00:04:11.760 END TEST guess_driver 00:04:11.760 ************************************ 00:04:11.760 00:04:11.760 real 0m15.568s 00:04:11.760 user 0m4.915s 00:04:11.760 sys 0m8.852s 00:04:11.760 13:32:25 setup.sh.driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:11.760 13:32:25 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:11.760 ************************************ 00:04:11.760 END TEST driver 00:04:11.760 ************************************ 00:04:11.760 13:32:25 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:11.760 
13:32:25 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:11.760 13:32:25 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:11.760 13:32:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:11.760 ************************************ 00:04:11.760 START TEST devices 00:04:11.760 ************************************ 00:04:11.760 13:32:25 setup.sh.devices -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:11.760 * Looking for test storage... 00:04:11.760 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:11.760 13:32:25 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:11.760 13:32:25 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:11.760 13:32:25 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:11.760 13:32:25 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1669 -- # local nvme bdf 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@196 -- # 
blocks=() 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:15.971 13:32:29 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:15.971 No valid GPT data, bailing 00:04:15.971 13:32:29 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:15.971 13:32:29 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:15.971 13:32:29 setup.sh.devices -- setup/common.sh@80 -- # echo 1920383410176 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@204 -- # (( 1920383410176 >= min_disk_size )) 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@205 -- # 
blocks+=("${block##*/}") 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:15.971 13:32:29 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:15.971 ************************************ 00:04:15.971 START TEST nvme_mount 00:04:15.971 ************************************ 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # nvme_mount 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:15.971 13:32:29 
setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:15.971 13:32:29 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:16.543 Creating new GPT entries in memory. 00:04:16.543 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:16.543 other utilities. 00:04:16.543 13:32:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:16.543 13:32:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:16.543 13:32:30 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:16.543 13:32:30 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:16.543 13:32:30 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:17.487 Creating new GPT entries in memory. 00:04:17.487 The operation has completed successfully. 
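The device scan earlier in the log (devices.sh@196-209) sizes each candidate disk before adding it to `blocks`: `/sys/block/<dev>/size` reports 512-byte sectors, so the script multiplies by 512 and compares against `min_disk_size` (3221225472 bytes, i.e. 3 GiB). A minimal sketch of that gate — function names here are illustrative, not the script's own:

```shell
#!/usr/bin/env bash
# Convert a sector count (512-byte units, as reported by
# /sys/block/<dev>/size) to bytes.
sectors_to_bytes() {
    local sectors=$1
    echo $(( sectors * 512 ))
}

# Succeed (exit 0) when the disk meets the minimum size, fail otherwise.
disk_large_enough() {
    local bytes=$1 min_disk_size=3221225472   # 3 GiB, as in devices.sh@198
    (( bytes >= min_disk_size ))
}

# The sector count behind the log's 1920383410176-byte figure.
bytes=$(sectors_to_bytes 3750748848)
echo "$bytes"
disk_large_enough "$bytes" && echo "usable"
```

This matches the log's `echo 1920383410176` followed by the `(( ... >= min_disk_size ))` check that admits nvme0n1 as the test disk.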
00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1413477 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.487 
13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.487 13:32:31 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
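The status check above (devices.sh@62) compares each setup.sh status message with a bash glob, not a regex; the backslash-escaped form in the xtrace output (`*\A\c\t\i\v\e\ ...`) is simply how bash prints the pattern. A simplified sketch of that match, using a shortened stand-in status string:

```shell
#!/usr/bin/env bash
# Status as printed by setup.sh when a device stays bound because it is
# in use (string shortened for illustration).
status="Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev"
mounts="nvme0n1:nvme0n1p1"
found=0

# [[ ... == *pattern* ]] performs glob matching; quoted parts match
# literally, unquoted * matches any run of characters.
if [[ $status == *"Active devices: "*"$mounts"* ]]; then
    found=1
fi
echo "found=$found"
```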
00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:21.721 13:32:35 
setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.721 13:32:35 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.721 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:21.721 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:21.721 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:21.721 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:21.721 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:21.983 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:21.983 /dev/nvme0n1: 8 bytes were erased at offset 0x1bf1fc55e00 (gpt): 45 46 49 20 50 41 52 54 00:04:21.983 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:21.983 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:21.983 13:32:36 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
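The `mkfs` helper above takes an optional size argument — the @113 call passes `1024M` to format only the first 1 GiB of the raw disk, while the earlier @102 call on the partition omits it — and appends it to `mkfs.ext4` only when set. A dry-run sketch of that conditional-argument pattern (the `echo` stands in for the real `mkfs.ext4`/`mount` calls, which need a block device and root):

```shell
#!/usr/bin/env bash
# Build the mkfs.ext4 command line, appending the size only when given.
# ${size:+"$size"} expands to nothing when size is empty or unset.
format_cmd() {
    local dev=$1 size=$2
    echo mkfs.ext4 -qF "$dev" ${size:+"$size"}
}

format_cmd /dev/nvme0n1p1          # whole partition, no size cap
format_cmd /dev/nvme0n1 1024M      # only the first 1024M of the disk
```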
00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.190 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:39 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ 
*\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- 
setup/devices.sh@47 -- # setup output config 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.191 13:32:40 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:30.394 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.394 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
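The long run of `read -r pci _ _ status` / `[[ ... ]]` pairs above is one loop unrolled by xtrace: `verify` reads each line of `setup.sh config` output, skips every PCI address except the device under test, and sets `found` when that device's status names the expected active mounts. A compressed sketch of the loop, fed by a fabricated two-line fixture (the vendor/device columns are placeholders):

```shell
#!/usr/bin/env bash
dev="0000:65:00.0"
mounts="data@nvme0n1"
found=0

# Fixture standing in for `setup.sh config` output:
# "<pci> <vendor> <device> <status...>"
config_output='0000:80:01.6 8086 0b00 Skipping denied controller
0000:65:00.0 8086 0a54 Active devices: data@nvme0n1, so not binding PCI dev'

while read -r pci _ _ status; do
    # Only the status of the device under test matters.
    [[ $pci == "$dev" ]] || continue
    if [[ $status == *"Active devices: "*"$mounts"* ]]; then
        found=1
    fi
done <<< "$config_output"

echo "found=$found"
```

The `(( found == 1 ))` check that follows the loop in the log is what lets the test proceed to the unmount/cleanup phase.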
00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:30.395 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:30.395 00:04:30.395 real 0m14.875s 00:04:30.395 user 0m4.694s 00:04:30.395 sys 0m8.063s 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:30.395 13:32:44 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:30.395 ************************************ 00:04:30.395 END TEST nvme_mount 00:04:30.395 ************************************ 00:04:30.395 13:32:44 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:30.395 13:32:44 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:30.395 13:32:44 setup.sh.devices -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:04:30.395 13:32:44 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:30.395 ************************************ 00:04:30.395 START TEST dm_mount 00:04:30.395 ************************************ 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # dm_mount 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:30.395 13:32:44 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:30.395 13:32:44 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:31.338 Creating new GPT entries in memory. 00:04:31.338 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:31.338 other utilities. 00:04:31.338 13:32:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:31.338 13:32:45 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:31.338 13:32:45 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:31.338 13:32:45 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:31.338 13:32:45 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:32.281 Creating new GPT entries in memory. 00:04:32.281 The operation has completed successfully. 00:04:32.281 13:32:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:32.281 13:32:46 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:32.281 13:32:46 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:32.281 13:32:46 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:32.281 13:32:46 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:33.668 The operation has completed successfully. 
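The sgdisk bounds in both runs above come from a small recurrence (common.sh@58-59): each 1 GiB partition is 1073741824/512 = 2097152 sectors, the first starts at sector 2048, and each subsequent one starts immediately after the previous end — yielding `--new=1:2048:2099199` and `--new=2:2099200:4196351`. A standalone sketch of that arithmetic:

```shell
#!/usr/bin/env bash
size=1073741824          # 1 GiB per partition, in bytes
(( size /= 512 ))        # convert to 512-byte sectors: 2097152

part_start=0 part_end=0
for part in 1 2; do
    # First partition starts at sector 2048 (1 MiB alignment);
    # later ones start right after the previous partition's end.
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    echo "--new=$part:$part_start:$part_end"
done
```

Running this prints exactly the two `--new=` arguments seen in the dm_mount partitioning steps above.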
00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1419233 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local 
dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.668 
13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.668 13:32:47 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 
13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:36.973 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local 
test_file= 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:37.545 13:32:51 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 
]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:41.753 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:41.754 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:41.754 00:04:41.754 real 0m11.272s 00:04:41.754 user 0m3.018s 00:04:41.754 sys 0m5.253s 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:41.754 13:32:55 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:41.754 ************************************ 00:04:41.754 END TEST dm_mount 00:04:41.754 ************************************ 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:41.754 13:32:56 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:42.015 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:42.015 /dev/nvme0n1: 8 bytes were erased at offset 0x1bf1fc55e00 (gpt): 45 46 49 20 50 41 52 54 00:04:42.015 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:42.015 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 
00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:04:42.015 13:32:56 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:04:42.015
00:04:42.015 real	0m31.136s
00:04:42.015 user	0m9.521s
00:04:42.015 sys	0m16.381s
00:04:42.015 13:32:56 setup.sh.devices -- common/autotest_common.sh@1125 -- # xtrace_disable
00:04:42.015 13:32:56 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:04:42.015 ************************************
00:04:42.015 END TEST devices
00:04:42.015 ************************************
00:04:42.015
00:04:42.015 real	1m45.542s
00:04:42.015 user	0m35.278s
00:04:42.015 sys	1m1.166s
00:04:42.015 13:32:56 setup.sh -- common/autotest_common.sh@1125 -- # xtrace_disable
00:04:42.015 13:32:56 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:04:42.015 ************************************
00:04:42.015 END TEST setup.sh
00:04:42.015 ************************************
00:04:42.015 13:32:56 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:04:46.220 Hugepages
00:04:46.220 node hugesize free / total
00:04:46.220 node0 1048576kB 0 / 0
00:04:46.220 node0 2048kB 1024 / 1024
00:04:46.220 node1 1048576kB 0 / 0
00:04:46.220 node1 2048kB 1024 / 1024
00:04:46.220
00:04:46.220 Type BDF Vendor Device NUMA Driver Device Block devices
00:04:46.220 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - -
00:04:46.220 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - -
00:04:46.220 NVMe 0000:65:00.0 144d a80a 0 nvme nvme0 nvme0n1
00:04:46.220 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - -
00:04:46.220 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - -
00:04:46.220 13:33:00 -- spdk/autotest.sh@130 -- # uname -s
00:04:46.220 13:33:00 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]]
00:04:46.220 13:33:00 -- spdk/autotest.sh@132 -- # nvme_namespace_revert
00:04:46.220 13:33:00 -- common/autotest_common.sh@1530 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:50.431 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci
00:04:50.431 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci
00:04:51.814 0000:65:00.0 (144d a80a): nvme -> vfio-pci
00:04:52.075 13:33:06 -- common/autotest_common.sh@1531 -- # sleep 1
00:04:53.018 13:33:07 -- common/autotest_common.sh@1532 -- # bdfs=()
00:04:53.018 13:33:07 -- common/autotest_common.sh@1532 -- # local bdfs
00:04:53.018 13:33:07 -- common/autotest_common.sh@1533 -- # bdfs=($(get_nvme_bdfs))
00:04:53.018 13:33:07 -- common/autotest_common.sh@1533 -- # get_nvme_bdfs
00:04:53.018 13:33:07 -- common/autotest_common.sh@1512 -- # bdfs=()
00:04:53.018 13:33:07 -- common/autotest_common.sh@1512 -- # local bdfs
00:04:53.018 13:33:07 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:04:53.018 13:33:07 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
00:04:53.018 13:33:07 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr'
00:04:53.279 13:33:07 -- common/autotest_common.sh@1514 -- # (( 1 == 0 ))
00:04:53.279 13:33:07 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:65:00.0
00:04:53.279 13:33:07 -- common/autotest_common.sh@1535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:56.733 Waiting for block devices as requested
00:04:56.733 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma
00:04:56.733 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma
00:04:56.995 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma
00:04:56.995 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma
00:04:56.995 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma
00:04:57.256 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma
00:04:57.256 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma
00:04:57.256 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma
00:04:57.517 0000:65:00.0 (144d a80a): vfio-pci -> nvme
00:04:57.517 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma
00:04:57.777 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma
00:04:57.777 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma
00:04:57.777 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma
00:04:58.038 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma
00:04:58.038 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma
00:04:58.038 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma
00:04:58.298 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma
00:04:58.559 13:33:12 -- common/autotest_common.sh@1537 -- # for bdf in "${bdfs[@]}"
00:04:58.559 13:33:12 -- common/autotest_common.sh@1538 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1501 -- # readlink -f /sys/class/nvme/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1501 -- # grep 0000:65:00.0/nvme/nvme
00:04:58.559 13:33:12 -- common/autotest_common.sh@1501 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1502 -- # [[ -z /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]]
00:04:58.559 13:33:12 -- common/autotest_common.sh@1506 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1506 -- # printf '%s\n' nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1538 -- # nvme_ctrlr=/dev/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1539 -- # [[ -z /dev/nvme0 ]]
00:04:58.559 13:33:12 -- common/autotest_common.sh@1544 -- # nvme id-ctrl /dev/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1544 -- # grep oacs
00:04:58.559 13:33:12 -- common/autotest_common.sh@1544 -- # cut -d: -f2
00:04:58.559 13:33:12 -- common/autotest_common.sh@1544 -- # oacs=' 0x5f'
00:04:58.559 13:33:12 -- common/autotest_common.sh@1545 -- # oacs_ns_manage=8
00:04:58.559 13:33:12 -- common/autotest_common.sh@1547 -- # [[ 8 -ne 0 ]]
00:04:58.559 13:33:12 -- common/autotest_common.sh@1553 -- # nvme id-ctrl /dev/nvme0
00:04:58.559 13:33:12 -- common/autotest_common.sh@1553 -- # grep unvmcap
00:04:58.559 13:33:12 -- common/autotest_common.sh@1553 -- # cut -d: -f2
00:04:58.559 13:33:12 -- common/autotest_common.sh@1553 -- # unvmcap=' 0'
00:04:58.559 13:33:12 -- common/autotest_common.sh@1554 -- # [[ 0 -eq 0 ]]
00:04:58.559 13:33:12 -- common/autotest_common.sh@1556 -- # continue
00:04:58.559 13:33:12 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup
00:04:58.559 13:33:12 -- common/autotest_common.sh@729 -- # xtrace_disable
00:04:58.559 13:33:12 -- common/autotest_common.sh@10 -- # set +x
00:04:58.559 13:33:12 -- spdk/autotest.sh@138 -- # timing_enter afterboot
00:04:58.559 13:33:12 -- common/autotest_common.sh@723 -- # xtrace_disable
00:04:58.559 13:33:12 -- common/autotest_common.sh@10 -- # set +x
00:04:58.559 13:33:12 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:02.763 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci
00:05:02.763 0000:65:00.0 (144d a80a): nvme -> vfio-pci
00:05:02.763 13:33:16 -- spdk/autotest.sh@140 -- # timing_exit afterboot
00:05:02.763 13:33:16 -- common/autotest_common.sh@729 -- # xtrace_disable
00:05:02.763 13:33:16 -- common/autotest_common.sh@10 -- # set +x
00:05:02.763 13:33:17 -- spdk/autotest.sh@144 -- # opal_revert_cleanup
00:05:02.763 13:33:17 -- common/autotest_common.sh@1590 -- # mapfile -t bdfs
00:05:02.763 13:33:17 -- common/autotest_common.sh@1590 -- # get_nvme_bdfs_by_id 0x0a54
00:05:02.763 13:33:17 -- common/autotest_common.sh@1576 -- # bdfs=()
00:05:02.763 13:33:17 -- common/autotest_common.sh@1576 -- # local bdfs
00:05:02.763 13:33:17 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs
00:05:02.763 13:33:17 -- common/autotest_common.sh@1512 -- # bdfs=()
00:05:02.763 13:33:17 -- common/autotest_common.sh@1512 -- # local bdfs
00:05:02.763 13:33:17 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:05:02.763 13:33:17 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
00:05:02.763 13:33:17 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr'
00:05:02.763 13:33:17 -- common/autotest_common.sh@1514 -- # (( 1 == 0 ))
00:05:02.763 13:33:17 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:65:00.0
00:05:02.763 13:33:17 -- common/autotest_common.sh@1578 -- # for bdf in $(get_nvme_bdfs)
00:05:02.763 13:33:17 -- common/autotest_common.sh@1579 -- # cat /sys/bus/pci/devices/0000:65:00.0/device
00:05:02.763 13:33:17 -- common/autotest_common.sh@1579 -- # device=0xa80a
00:05:02.763 13:33:17 -- common/autotest_common.sh@1580 -- # [[ 0xa80a == \0\x\0\a\5\4 ]]
00:05:02.763 13:33:17 -- common/autotest_common.sh@1585 -- # printf '%s\n'
00:05:02.763 13:33:17 -- common/autotest_common.sh@1591 -- # [[ -z '' ]]
00:05:02.763 13:33:17 -- common/autotest_common.sh@1592 -- # return 0
00:05:02.763 13:33:17 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']'
00:05:02.763 13:33:17 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']'
00:05:02.763 13:33:17 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]]
00:05:02.763 13:33:17 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]]
00:05:02.763 13:33:17 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh
00:05:03.333 Restarting all devices.
00:05:07.535 lstat() error: No such file or directory
00:05:07.535 QAT Error: No GENERAL section found
00:05:07.535 Failed to configure qat_dev0
00:05:07.535 lstat() error: No such file or directory
00:05:07.535 QAT Error: No GENERAL section found
00:05:07.535 Failed to configure qat_dev1
00:05:07.535 lstat() error: No such file or directory
00:05:07.535 QAT Error: No GENERAL section found
00:05:07.535 Failed to configure qat_dev2
00:05:07.535 enable sriov
00:05:07.535 Checking status of all devices.
00:05:07.535 There is 3 QAT acceleration device(s) in the system:
00:05:07.535 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:4d:00.0, #accel: 5 #engines: 10 state: down
00:05:07.535 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:4f:00.0, #accel: 5 #engines: 10 state: down
00:05:07.535 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:51:00.0, #accel: 5 #engines: 10 state: down
00:05:07.535 0000:4d:00.0 set to 16 VFs
00:05:08.475 0000:4f:00.0 set to 16 VFs
00:05:09.418 0000:51:00.0 set to 16 VFs
00:05:10.802 Properly configured the qat device with driver uio_pci_generic.
00:05:10.802 13:33:24 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:10.803 13:33:24 -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:10.803 13:33:24 -- common/autotest_common.sh@10 -- # set +x 00:05:10.803 13:33:24 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:10.803 13:33:24 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:10.803 13:33:24 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:10.803 13:33:24 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:10.803 13:33:24 -- common/autotest_common.sh@10 -- # set +x 00:05:10.803 ************************************ 00:05:10.803 START TEST env 00:05:10.803 ************************************ 00:05:10.803 13:33:25 env -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:10.803 * Looking for test storage... 00:05:10.803 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:10.803 13:33:25 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:10.803 13:33:25 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:10.803 13:33:25 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:10.803 13:33:25 env -- common/autotest_common.sh@10 -- # set +x 00:05:10.803 ************************************ 00:05:10.803 START TEST env_memory 00:05:10.803 ************************************ 00:05:10.803 13:33:25 env.env_memory -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:10.803 00:05:10.803 00:05:10.803 CUnit - A unit testing framework for C - Version 2.1-3 00:05:10.803 http://cunit.sourceforge.net/ 00:05:10.803 00:05:10.803 00:05:10.803 Suite: memory 00:05:10.803 Test: alloc and free memory map ...[2024-06-10 13:33:25.227571] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:10.803 passed 00:05:10.803 Test: mem map translation ...[2024-06-10 13:33:25.253139] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:10.803 [2024-06-10 13:33:25.253173] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:10.803 [2024-06-10 13:33:25.253220] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:10.803 [2024-06-10 13:33:25.253227] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:11.066 passed 00:05:11.066 Test: mem map registration ...[2024-06-10 13:33:25.308518] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:11.066 [2024-06-10 13:33:25.308540] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:11.066 passed 00:05:11.066 Test: mem map adjacent registrations ...passed 00:05:11.066 00:05:11.066 Run Summary: Type Total Ran Passed Failed Inactive 00:05:11.066 suites 1 1 n/a 0 0 00:05:11.066 tests 4 4 4 0 0 00:05:11.066 asserts 152 152 152 0 n/a 00:05:11.066 00:05:11.066 Elapsed time = 0.194 seconds 00:05:11.066 00:05:11.066 real 0m0.208s 00:05:11.066 user 0m0.202s 00:05:11.066 sys 0m0.005s 00:05:11.066 13:33:25 env.env_memory -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:05:11.066 13:33:25 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:11.066 ************************************ 00:05:11.066 END TEST env_memory 00:05:11.066 ************************************ 00:05:11.066 13:33:25 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:11.066 13:33:25 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:11.066 13:33:25 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:11.066 13:33:25 env -- common/autotest_common.sh@10 -- # set +x 00:05:11.066 ************************************ 00:05:11.066 START TEST env_vtophys 00:05:11.066 ************************************ 00:05:11.066 13:33:25 env.env_vtophys -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:11.066 EAL: lib.eal log level changed from notice to debug 00:05:11.066 EAL: Detected lcore 0 as core 0 on socket 0 00:05:11.066 EAL: Detected lcore 1 as core 1 on socket 0 00:05:11.066 EAL: Detected lcore 2 as core 2 on socket 0 00:05:11.066 EAL: Detected lcore 3 as core 3 on socket 0 00:05:11.066 EAL: Detected lcore 4 as core 4 on socket 0 00:05:11.066 EAL: Detected lcore 5 as core 5 on socket 0 00:05:11.066 EAL: Detected lcore 6 as core 6 on socket 0 00:05:11.066 EAL: Detected lcore 7 as core 7 on socket 0 00:05:11.066 EAL: Detected lcore 8 as core 8 on socket 0 00:05:11.066 EAL: Detected lcore 9 as core 9 on socket 0 00:05:11.066 EAL: Detected lcore 10 as core 10 on socket 0 00:05:11.066 EAL: Detected lcore 11 as core 11 on socket 0 00:05:11.066 EAL: Detected lcore 12 as core 12 on socket 0 00:05:11.066 EAL: Detected lcore 13 as core 13 on socket 0 00:05:11.066 EAL: Detected lcore 14 as core 14 on socket 0 00:05:11.066 EAL: Detected lcore 15 as core 15 on socket 0 00:05:11.066 EAL: Detected lcore 16 as core 16 on socket 0 00:05:11.066 EAL: Detected lcore 17 as core 17 on socket 0 00:05:11.066 EAL: 
Detected lcore 18 as core 18 on socket 0 00:05:11.066 EAL: Detected lcore 19 as core 19 on socket 0 00:05:11.066 EAL: Detected lcore 20 as core 20 on socket 0 00:05:11.066 EAL: Detected lcore 21 as core 21 on socket 0 00:05:11.066 EAL: Detected lcore 22 as core 22 on socket 0 00:05:11.066 EAL: Detected lcore 23 as core 23 on socket 0 00:05:11.066 EAL: Detected lcore 24 as core 24 on socket 0 00:05:11.066 EAL: Detected lcore 25 as core 25 on socket 0 00:05:11.066 EAL: Detected lcore 26 as core 26 on socket 0 00:05:11.066 EAL: Detected lcore 27 as core 27 on socket 0 00:05:11.066 EAL: Detected lcore 28 as core 28 on socket 0 00:05:11.066 EAL: Detected lcore 29 as core 29 on socket 0 00:05:11.066 EAL: Detected lcore 30 as core 30 on socket 0 00:05:11.066 EAL: Detected lcore 31 as core 31 on socket 0 00:05:11.066 EAL: Detected lcore 32 as core 32 on socket 0 00:05:11.066 EAL: Detected lcore 33 as core 33 on socket 0 00:05:11.066 EAL: Detected lcore 34 as core 34 on socket 0 00:05:11.066 EAL: Detected lcore 35 as core 35 on socket 0 00:05:11.066 EAL: Detected lcore 36 as core 0 on socket 1 00:05:11.066 EAL: Detected lcore 37 as core 1 on socket 1 00:05:11.066 EAL: Detected lcore 38 as core 2 on socket 1 00:05:11.066 EAL: Detected lcore 39 as core 3 on socket 1 00:05:11.066 EAL: Detected lcore 40 as core 4 on socket 1 00:05:11.066 EAL: Detected lcore 41 as core 5 on socket 1 00:05:11.066 EAL: Detected lcore 42 as core 6 on socket 1 00:05:11.066 EAL: Detected lcore 43 as core 7 on socket 1 00:05:11.066 EAL: Detected lcore 44 as core 8 on socket 1 00:05:11.066 EAL: Detected lcore 45 as core 9 on socket 1 00:05:11.066 EAL: Detected lcore 46 as core 10 on socket 1 00:05:11.066 EAL: Detected lcore 47 as core 11 on socket 1 00:05:11.066 EAL: Detected lcore 48 as core 12 on socket 1 00:05:11.066 EAL: Detected lcore 49 as core 13 on socket 1 00:05:11.066 EAL: Detected lcore 50 as core 14 on socket 1 00:05:11.066 EAL: Detected lcore 51 as core 15 on socket 1 00:05:11.066 EAL: 
Detected lcore 52 as core 16 on socket 1 00:05:11.066 EAL: Detected lcore 53 as core 17 on socket 1 00:05:11.066 EAL: Detected lcore 54 as core 18 on socket 1 00:05:11.066 EAL: Detected lcore 55 as core 19 on socket 1 00:05:11.066 EAL: Detected lcore 56 as core 20 on socket 1 00:05:11.066 EAL: Detected lcore 57 as core 21 on socket 1 00:05:11.066 EAL: Detected lcore 58 as core 22 on socket 1 00:05:11.066 EAL: Detected lcore 59 as core 23 on socket 1 00:05:11.067 EAL: Detected lcore 60 as core 24 on socket 1 00:05:11.067 EAL: Detected lcore 61 as core 25 on socket 1 00:05:11.067 EAL: Detected lcore 62 as core 26 on socket 1 00:05:11.067 EAL: Detected lcore 63 as core 27 on socket 1 00:05:11.067 EAL: Detected lcore 64 as core 28 on socket 1 00:05:11.067 EAL: Detected lcore 65 as core 29 on socket 1 00:05:11.067 EAL: Detected lcore 66 as core 30 on socket 1 00:05:11.067 EAL: Detected lcore 67 as core 31 on socket 1 00:05:11.067 EAL: Detected lcore 68 as core 32 on socket 1 00:05:11.067 EAL: Detected lcore 69 as core 33 on socket 1 00:05:11.067 EAL: Detected lcore 70 as core 34 on socket 1 00:05:11.067 EAL: Detected lcore 71 as core 35 on socket 1 00:05:11.067 EAL: Detected lcore 72 as core 0 on socket 0 00:05:11.067 EAL: Detected lcore 73 as core 1 on socket 0 00:05:11.067 EAL: Detected lcore 74 as core 2 on socket 0 00:05:11.067 EAL: Detected lcore 75 as core 3 on socket 0 00:05:11.067 EAL: Detected lcore 76 as core 4 on socket 0 00:05:11.067 EAL: Detected lcore 77 as core 5 on socket 0 00:05:11.067 EAL: Detected lcore 78 as core 6 on socket 0 00:05:11.067 EAL: Detected lcore 79 as core 7 on socket 0 00:05:11.067 EAL: Detected lcore 80 as core 8 on socket 0 00:05:11.067 EAL: Detected lcore 81 as core 9 on socket 0 00:05:11.067 EAL: Detected lcore 82 as core 10 on socket 0 00:05:11.067 EAL: Detected lcore 83 as core 11 on socket 0 00:05:11.067 EAL: Detected lcore 84 as core 12 on socket 0 00:05:11.067 EAL: Detected lcore 85 as core 13 on socket 0 00:05:11.067 EAL: 
Detected lcore 86 as core 14 on socket 0 00:05:11.067 EAL: Detected lcore 87 as core 15 on socket 0 00:05:11.067 EAL: Detected lcore 88 as core 16 on socket 0 00:05:11.067 EAL: Detected lcore 89 as core 17 on socket 0 00:05:11.067 EAL: Detected lcore 90 as core 18 on socket 0 00:05:11.067 EAL: Detected lcore 91 as core 19 on socket 0 00:05:11.067 EAL: Detected lcore 92 as core 20 on socket 0 00:05:11.067 EAL: Detected lcore 93 as core 21 on socket 0 00:05:11.067 EAL: Detected lcore 94 as core 22 on socket 0 00:05:11.067 EAL: Detected lcore 95 as core 23 on socket 0 00:05:11.067 EAL: Detected lcore 96 as core 24 on socket 0 00:05:11.067 EAL: Detected lcore 97 as core 25 on socket 0 00:05:11.067 EAL: Detected lcore 98 as core 26 on socket 0 00:05:11.067 EAL: Detected lcore 99 as core 27 on socket 0 00:05:11.067 EAL: Detected lcore 100 as core 28 on socket 0 00:05:11.067 EAL: Detected lcore 101 as core 29 on socket 0 00:05:11.067 EAL: Detected lcore 102 as core 30 on socket 0 00:05:11.067 EAL: Detected lcore 103 as core 31 on socket 0 00:05:11.067 EAL: Detected lcore 104 as core 32 on socket 0 00:05:11.067 EAL: Detected lcore 105 as core 33 on socket 0 00:05:11.067 EAL: Detected lcore 106 as core 34 on socket 0 00:05:11.067 EAL: Detected lcore 107 as core 35 on socket 0 00:05:11.067 EAL: Detected lcore 108 as core 0 on socket 1 00:05:11.067 EAL: Detected lcore 109 as core 1 on socket 1 00:05:11.067 EAL: Detected lcore 110 as core 2 on socket 1 00:05:11.067 EAL: Detected lcore 111 as core 3 on socket 1 00:05:11.067 EAL: Detected lcore 112 as core 4 on socket 1 00:05:11.067 EAL: Detected lcore 113 as core 5 on socket 1 00:05:11.067 EAL: Detected lcore 114 as core 6 on socket 1 00:05:11.067 EAL: Detected lcore 115 as core 7 on socket 1 00:05:11.067 EAL: Detected lcore 116 as core 8 on socket 1 00:05:11.067 EAL: Detected lcore 117 as core 9 on socket 1 00:05:11.067 EAL: Detected lcore 118 as core 10 on socket 1 00:05:11.067 EAL: Detected lcore 119 as core 11 on socket 1 
00:05:11.067 EAL: Detected lcore 120 as core 12 on socket 1 00:05:11.067 EAL: Detected lcore 121 as core 13 on socket 1 00:05:11.067 EAL: Detected lcore 122 as core 14 on socket 1 00:05:11.067 EAL: Detected lcore 123 as core 15 on socket 1 00:05:11.067 EAL: Detected lcore 124 as core 16 on socket 1 00:05:11.067 EAL: Detected lcore 125 as core 17 on socket 1 00:05:11.067 EAL: Detected lcore 126 as core 18 on socket 1 00:05:11.067 EAL: Detected lcore 127 as core 19 on socket 1 00:05:11.067 EAL: Skipped lcore 128 as core 20 on socket 1 00:05:11.067 EAL: Skipped lcore 129 as core 21 on socket 1 00:05:11.067 EAL: Skipped lcore 130 as core 22 on socket 1 00:05:11.067 EAL: Skipped lcore 131 as core 23 on socket 1 00:05:11.067 EAL: Skipped lcore 132 as core 24 on socket 1 00:05:11.067 EAL: Skipped lcore 133 as core 25 on socket 1 00:05:11.067 EAL: Skipped lcore 134 as core 26 on socket 1 00:05:11.067 EAL: Skipped lcore 135 as core 27 on socket 1 00:05:11.067 EAL: Skipped lcore 136 as core 28 on socket 1 00:05:11.067 EAL: Skipped lcore 137 as core 29 on socket 1 00:05:11.067 EAL: Skipped lcore 138 as core 30 on socket 1 00:05:11.067 EAL: Skipped lcore 139 as core 31 on socket 1 00:05:11.067 EAL: Skipped lcore 140 as core 32 on socket 1 00:05:11.067 EAL: Skipped lcore 141 as core 33 on socket 1 00:05:11.067 EAL: Skipped lcore 142 as core 34 on socket 1 00:05:11.067 EAL: Skipped lcore 143 as core 35 on socket 1 00:05:11.067 EAL: Maximum logical cores by configuration: 128 00:05:11.067 EAL: Detected CPU lcores: 128 00:05:11.067 EAL: Detected NUMA nodes: 2 00:05:11.067 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:11.067 EAL: Detected shared linkage of DPDK 00:05:11.067 EAL: No shared files mode enabled, IPC will be disabled 00:05:11.067 EAL: No shared files mode enabled, IPC is disabled 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver 
qat for device 0000:4d:01.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:01.7 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4d:02.7 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:01.7 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 
0000:4f:02.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:4f:02.7 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:01.7 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.0 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.1 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.2 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.3 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.4 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.5 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.6 wants IOVA as 'PA' 00:05:11.067 EAL: PCI driver qat for device 0000:51:02.7 wants IOVA as 'PA' 00:05:11.067 EAL: Bus pci wants IOVA as 'PA' 00:05:11.067 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:11.067 EAL: Bus vdev wants IOVA as 'DC' 00:05:11.067 EAL: Selected IOVA mode 'PA' 00:05:11.067 EAL: Probing VFIO support... 
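The lcore topology dump earlier in this test (lcores 0-143 across two sockets, with lcores 128-143 skipped by the 128-lcore configuration limit) can be condensed with a small parser. The record format is taken verbatim from the log; the `summarize` helper is our own illustration, not part of DPDK or SPDK:

```python
import re

# Summarize EAL lcore-detection records of the form seen above:
#   "EAL: Detected lcore N as core M on socket S"
#   "EAL: Skipped lcore N as core M on socket S"
LCORE_RE = re.compile(
    r"EAL: (Detected|Skipped) lcore (\d+) as core (\d+) on socket (\d+)"
)

def summarize(lines):
    """Return (detected lcores per socket, number of skipped lcores)."""
    per_socket = {}
    skipped = 0
    for line in lines:
        m = LCORE_RE.search(line)
        if not m:
            continue
        state, socket = m.group(1), int(m.group(4))
        if state == "Skipped":
            skipped += 1
        else:
            per_socket[socket] = per_socket.get(socket, 0) + 1
    return per_socket, skipped

sample = [
    "EAL: Detected lcore 0 as core 0 on socket 0",
    "EAL: Detected lcore 36 as core 0 on socket 1",
    "EAL: Skipped lcore 128 as core 20 on socket 1",
]
assert summarize(sample) == ({0: 1, 1: 1}, 1)
```

Run over the full dump, this yields 128 detected lcores and 16 skipped ones, matching the "Maximum logical cores by configuration: 128" and "Detected CPU lcores: 128" summary lines.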
00:05:11.067 EAL: IOMMU type 1 (Type 1) is supported 00:05:11.067 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:11.067 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:11.067 EAL: VFIO support initialized 00:05:11.067 EAL: Ask a virtual area of 0x2e000 bytes 00:05:11.067 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:11.067 EAL: Setting up physically contiguous memory... 00:05:11.067 EAL: Setting maximum number of open files to 524288 00:05:11.067 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:11.067 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:11.067 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:11.067 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.067 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:11.067 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.067 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.067 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:11.067 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:11.067 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.067 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:11.067 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.067 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.067 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:11.067 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:11.067 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.067 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:11.067 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.067 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.067 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:11.067 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:11.067 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.067 EAL: 
Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:11.067 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:11.068 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.068 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:11.068 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:11.068 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:11.068 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.068 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:11.068 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.068 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.068 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:11.068 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:11.068 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.068 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:11.068 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.068 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.068 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:11.068 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:11.068 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.068 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:11.068 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.068 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.068 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:11.068 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:11.068 EAL: Ask a virtual area of 0x61000 bytes 00:05:11.068 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:11.068 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:11.068 EAL: Ask a virtual area of 0x400000000 bytes 00:05:11.068 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 
00:05:11.068 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:11.068 EAL: Hugepages will be freed exactly as allocated. 00:05:11.068 EAL: No shared files mode enabled, IPC is disabled 00:05:11.068 EAL: No shared files mode enabled, IPC is disabled 00:05:11.068 EAL: TSC frequency is ~2400000 KHz 00:05:11.068 EAL: Main lcore 0 is ready (tid=7f5b76308b00;cpuset=[0]) 00:05:11.068 EAL: Trying to obtain current memory policy. 00:05:11.068 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.068 EAL: Restoring previous memory policy: 0 00:05:11.068 EAL: request: mp_malloc_sync 00:05:11.068 EAL: No shared files mode enabled, IPC is disabled 00:05:11.068 EAL: Heap on socket 0 was expanded by 2MB 00:05:11.068 EAL: PCI device 0000:4d:01.0 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001000000 00:05:11.068 EAL: PCI memory mapped at 0x202001001000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.0 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.1 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001002000 00:05:11.068 EAL: PCI memory mapped at 0x202001003000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.1 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.2 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001004000 00:05:11.068 EAL: PCI memory mapped at 0x202001005000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.2 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.3 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001006000 00:05:11.068 EAL: PCI memory mapped at 0x202001007000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.3 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.4 on NUMA socket 0 00:05:11.068 EAL: probe 
driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001008000 00:05:11.068 EAL: PCI memory mapped at 0x202001009000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.4 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.5 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200100a000 00:05:11.068 EAL: PCI memory mapped at 0x20200100b000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.5 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.6 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200100c000 00:05:11.068 EAL: PCI memory mapped at 0x20200100d000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.6 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:01.7 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200100e000 00:05:11.068 EAL: PCI memory mapped at 0x20200100f000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.7 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.0 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001010000 00:05:11.068 EAL: PCI memory mapped at 0x202001011000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.0 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.1 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001012000 00:05:11.068 EAL: PCI memory mapped at 0x202001013000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.1 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.2 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001014000 00:05:11.068 EAL: PCI memory mapped at 0x202001015000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.2 (socket 0) 
00:05:11.068 EAL: PCI device 0000:4d:02.3 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001016000 00:05:11.068 EAL: PCI memory mapped at 0x202001017000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.3 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.4 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001018000 00:05:11.068 EAL: PCI memory mapped at 0x202001019000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.4 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.5 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200101a000 00:05:11.068 EAL: PCI memory mapped at 0x20200101b000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.5 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.6 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200101c000 00:05:11.068 EAL: PCI memory mapped at 0x20200101d000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.6 (socket 0) 00:05:11.068 EAL: PCI device 0000:4d:02.7 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200101e000 00:05:11.068 EAL: PCI memory mapped at 0x20200101f000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.7 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.0 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001020000 00:05:11.068 EAL: PCI memory mapped at 0x202001021000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.0 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.1 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001022000 00:05:11.068 EAL: PCI memory mapped at 0x202001023000 
00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.1 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.2 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001024000 00:05:11.068 EAL: PCI memory mapped at 0x202001025000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.2 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.3 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001026000 00:05:11.068 EAL: PCI memory mapped at 0x202001027000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.3 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.4 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x202001028000 00:05:11.068 EAL: PCI memory mapped at 0x202001029000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.4 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.5 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200102a000 00:05:11.068 EAL: PCI memory mapped at 0x20200102b000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.5 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.6 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200102c000 00:05:11.068 EAL: PCI memory mapped at 0x20200102d000 00:05:11.068 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.6 (socket 0) 00:05:11.068 EAL: PCI device 0000:4f:01.7 on NUMA socket 0 00:05:11.068 EAL: probe driver: 8086:37c9 qat 00:05:11.068 EAL: PCI memory mapped at 0x20200102e000 00:05:11.068 EAL: PCI memory mapped at 0x20200102f000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.7 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.0 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory 
mapped at 0x202001030000 00:05:11.069 EAL: PCI memory mapped at 0x202001031000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.0 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.1 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001032000 00:05:11.069 EAL: PCI memory mapped at 0x202001033000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.1 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.2 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001034000 00:05:11.069 EAL: PCI memory mapped at 0x202001035000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.2 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.3 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001036000 00:05:11.069 EAL: PCI memory mapped at 0x202001037000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.3 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.4 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001038000 00:05:11.069 EAL: PCI memory mapped at 0x202001039000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.4 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.5 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200103a000 00:05:11.069 EAL: PCI memory mapped at 0x20200103b000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.5 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.6 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200103c000 00:05:11.069 EAL: PCI memory mapped at 0x20200103d000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.6 (socket 0) 00:05:11.069 EAL: PCI device 0000:4f:02.7 on NUMA 
socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200103e000 00:05:11.069 EAL: PCI memory mapped at 0x20200103f000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.7 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.0 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001040000 00:05:11.069 EAL: PCI memory mapped at 0x202001041000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.0 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.1 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001042000 00:05:11.069 EAL: PCI memory mapped at 0x202001043000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.1 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.2 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001044000 00:05:11.069 EAL: PCI memory mapped at 0x202001045000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.2 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.3 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001046000 00:05:11.069 EAL: PCI memory mapped at 0x202001047000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.3 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.4 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001048000 00:05:11.069 EAL: PCI memory mapped at 0x202001049000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.4 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.5 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200104a000 00:05:11.069 EAL: PCI memory mapped at 0x20200104b000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:51:01.5 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.6 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200104c000 00:05:11.069 EAL: PCI memory mapped at 0x20200104d000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.6 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:01.7 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x20200104e000 00:05:11.069 EAL: PCI memory mapped at 0x20200104f000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.7 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:02.0 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001050000 00:05:11.069 EAL: PCI memory mapped at 0x202001051000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.0 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:02.1 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001052000 00:05:11.069 EAL: PCI memory mapped at 0x202001053000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.1 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:02.2 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001054000 00:05:11.069 EAL: PCI memory mapped at 0x202001055000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.2 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:02.3 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001056000 00:05:11.069 EAL: PCI memory mapped at 0x202001057000 00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.3 (socket 0) 00:05:11.069 EAL: PCI device 0000:51:02.4 on NUMA socket 0 00:05:11.069 EAL: probe driver: 8086:37c9 qat 00:05:11.069 EAL: PCI memory mapped at 0x202001058000 00:05:11.069 EAL: PCI 
memory mapped at 0x202001059000
00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.4 (socket 0)
00:05:11.069 EAL: PCI device 0000:51:02.5 on NUMA socket 0
00:05:11.069 EAL: probe driver: 8086:37c9 qat
00:05:11.069 EAL: PCI memory mapped at 0x20200105a000
00:05:11.069 EAL: PCI memory mapped at 0x20200105b000
00:05:11.069 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.5 (socket 0)
00:05:11.330 EAL: PCI device 0000:51:02.6 on NUMA socket 0
00:05:11.330 EAL: probe driver: 8086:37c9 qat
00:05:11.330 EAL: PCI memory mapped at 0x20200105c000
00:05:11.330 EAL: PCI memory mapped at 0x20200105d000
00:05:11.330 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.6 (socket 0)
00:05:11.330 EAL: PCI device 0000:51:02.7 on NUMA socket 0
00:05:11.330 EAL: probe driver: 8086:37c9 qat
00:05:11.330 EAL: PCI memory mapped at 0x20200105e000
00:05:11.330 EAL: PCI memory mapped at 0x20200105f000
00:05:11.330 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.7 (socket 0)
00:05:11.330 EAL: No shared files mode enabled, IPC is disabled
00:05:11.330 EAL: No shared files mode enabled, IPC is disabled
00:05:11.331 EAL: No PCI address specified using 'addr=' in: bus=pci
00:05:11.331 EAL: Mem event callback 'spdk:(nil)' registered
00:05:11.331
00:05:11.331
00:05:11.331 CUnit - A unit testing framework for C - Version 2.1-3
00:05:11.331 http://cunit.sourceforge.net/
00:05:11.331
00:05:11.331
00:05:11.331 Suite: components_suite
00:05:11.331 Test: vtophys_malloc_test ...passed
00:05:11.331 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 4MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 4MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 6MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 6MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 10MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 10MB 00:05:11.331 EAL: Trying to obtain current memory policy. 
00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 18MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 18MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 34MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 34MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 66MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 66MB 00:05:11.331 EAL: Trying to obtain current memory policy. 
00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 130MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 130MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.331 EAL: Restoring previous memory policy: 4 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was expanded by 258MB 00:05:11.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.331 EAL: request: mp_malloc_sync 00:05:11.331 EAL: No shared files mode enabled, IPC is disabled 00:05:11.331 EAL: Heap on socket 0 was shrunk by 258MB 00:05:11.331 EAL: Trying to obtain current memory policy. 00:05:11.331 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.592 EAL: Restoring previous memory policy: 4 00:05:11.592 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.592 EAL: request: mp_malloc_sync 00:05:11.592 EAL: No shared files mode enabled, IPC is disabled 00:05:11.592 EAL: Heap on socket 0 was expanded by 514MB 00:05:11.592 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.592 EAL: request: mp_malloc_sync 00:05:11.592 EAL: No shared files mode enabled, IPC is disabled 00:05:11.592 EAL: Heap on socket 0 was shrunk by 514MB 00:05:11.592 EAL: Trying to obtain current memory policy. 
00:05:11.592 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:11.853 EAL: Restoring previous memory policy: 4 00:05:11.853 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.853 EAL: request: mp_malloc_sync 00:05:11.853 EAL: No shared files mode enabled, IPC is disabled 00:05:11.853 EAL: Heap on socket 0 was expanded by 1026MB 00:05:11.853 EAL: Calling mem event callback 'spdk:(nil)' 00:05:11.853 EAL: request: mp_malloc_sync 00:05:11.853 EAL: No shared files mode enabled, IPC is disabled 00:05:11.853 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:11.853 passed 00:05:11.853 00:05:11.853 Run Summary: Type Total Ran Passed Failed Inactive 00:05:11.853 suites 1 1 n/a 0 0 00:05:11.853 tests 2 2 2 0 0 00:05:11.853 asserts 7570 7570 7570 0 n/a 00:05:11.853 00:05:11.853 Elapsed time = 0.693 seconds 00:05:11.853 EAL: No shared files mode enabled, IPC is disabled 00:05:11.853 EAL: No shared files mode enabled, IPC is disabled 00:05:11.853 EAL: No shared files mode enabled, IPC is disabled 00:05:11.853 00:05:11.853 real 0m0.843s 00:05:11.853 user 0m0.441s 00:05:11.853 sys 0m0.379s 00:05:11.853 13:33:26 env.env_vtophys -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:11.853 13:33:26 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:11.853 ************************************ 00:05:11.853 END TEST env_vtophys 00:05:11.853 ************************************ 00:05:12.114 13:33:26 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:12.114 13:33:26 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:12.114 13:33:26 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:12.114 13:33:26 env -- common/autotest_common.sh@10 -- # set +x 00:05:12.114 ************************************ 00:05:12.114 START TEST env_pci 00:05:12.114 ************************************ 00:05:12.114 13:33:26 env.env_pci -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:12.114 00:05:12.114 00:05:12.114 CUnit - A unit testing framework for C - Version 2.1-3 00:05:12.114 http://cunit.sourceforge.net/ 00:05:12.114 00:05:12.114 00:05:12.114 Suite: pci 00:05:12.114 Test: pci_hook ...[2024-06-10 13:33:26.401711] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1433453 has claimed it 00:05:12.114 EAL: Cannot find device (10000:00:01.0) 00:05:12.114 EAL: Failed to attach device on primary process 00:05:12.114 passed 00:05:12.114 00:05:12.114 Run Summary: Type Total Ran Passed Failed Inactive 00:05:12.114 suites 1 1 n/a 0 0 00:05:12.114 tests 1 1 1 0 0 00:05:12.114 asserts 25 25 25 0 n/a 00:05:12.114 00:05:12.114 Elapsed time = 0.031 seconds 00:05:12.114 00:05:12.114 real 0m0.057s 00:05:12.114 user 0m0.023s 00:05:12.114 sys 0m0.033s 00:05:12.114 13:33:26 env.env_pci -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:12.114 13:33:26 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:12.114 ************************************ 00:05:12.114 END TEST env_pci 00:05:12.114 ************************************ 00:05:12.114 13:33:26 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:12.114 13:33:26 env -- env/env.sh@15 -- # uname 00:05:12.114 13:33:26 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:12.114 13:33:26 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:12.114 13:33:26 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:12.114 13:33:26 env -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:05:12.114 13:33:26 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:12.114 13:33:26 env -- common/autotest_common.sh@10 -- # set +x 
00:05:12.114 ************************************ 00:05:12.114 START TEST env_dpdk_post_init 00:05:12.114 ************************************ 00:05:12.114 13:33:26 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:12.114 EAL: Detected CPU lcores: 128 00:05:12.114 EAL: Detected NUMA nodes: 2 00:05:12.114 EAL: Detected shared linkage of DPDK 00:05:12.114 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:12.114 EAL: Selected IOVA mode 'PA' 00:05:12.114 EAL: VFIO support initialized 00:05:12.114 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.0 (socket 0) 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.0_qat_asym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.0_qat_sym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.114 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.1 (socket 0) 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.1_qat_asym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.1_qat_sym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.114 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.2 (socket 0) 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.2_qat_asym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.2_qat_sym 00:05:12.114 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.114 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:4d:01.3 (socket 0) 00:05:12.114 CRYPTODEV: Creating cryptodev 0000:4d:01.3_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.3_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.4 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.4_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.4_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.5 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.5_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.5_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.6 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.6_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.6_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.7 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:01.7_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: 
Creating cryptodev 0000:4d:01.7_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.0 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:02.0_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:02.0_qat_sym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.115 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.1 (socket 0) 00:05:12.115 CRYPTODEV: Creating cryptodev 0000:4d:02.1_qat_asym 00:05:12.115 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.376 CRYPTODEV: Creating cryptodev 0000:4d:02.1_qat_sym 00:05:12.376 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.376 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.2 (socket 0) 00:05:12.376 CRYPTODEV: Creating cryptodev 0000:4d:02.2_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.2_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.3 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.3_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.3_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.4 (socket 0) 
00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.4_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.4_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.5 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.5_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.5_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.6 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.6_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.6_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.7 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.7_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4d:02.7_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.0 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.0_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.0_qat_sym 
00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.1 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.1_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.1_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.2 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.2_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.2_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.3 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.3_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.3_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.4 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.4_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.4_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.5 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 
0000:4f:01.5_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.5_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.6 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.6_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.6_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.7 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.7_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:01.7_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.0 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.0_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.0_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.1 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.1_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.1_qat_sym 00:05:12.377 CRYPTODEV: Initialisation 
parameters - name: 0000:4f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.2 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.2_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.2_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.3 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.3_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.3_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.4 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.4_qat_asym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.4_qat_sym 00:05:12.377 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.377 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.5 (socket 0) 00:05:12.377 CRYPTODEV: Creating cryptodev 0000:4f:02.5_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:4f:02.5_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.6 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:4f:02.6_qat_asym 00:05:12.378 CRYPTODEV: 
Initialisation parameters - name: 0000:4f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:4f:02.6_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.7 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:4f:02.7_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:4f:02.7_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.0 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.0_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.0_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.1 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.1_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.1_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.2 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.2_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.2_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.2_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.3 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.3_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.3_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.4 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.4_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.4_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.5 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.5_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.5_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.6 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.6_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.6_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.7 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.7_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 
0000:51:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:01.7_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.0 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.0_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.0_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.1 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.1_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.1_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.2 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.2_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.2_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.3 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.3_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.3_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.4 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.4_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.4_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.5 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.5_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.5_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.6 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.6_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.6_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.7 (socket 0) 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.7_qat_asym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:12.378 CRYPTODEV: Creating cryptodev 0000:51:02.7_qat_sym 00:05:12.378 CRYPTODEV: Initialisation parameters - name: 0000:51:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:12.378 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:12.378 EAL: Using IOMMU type 1 (Type 1) 00:05:12.378 EAL: Ignore mapping IO port bar(1) 00:05:12.644 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0) 
00:05:12.644 EAL: Ignore mapping IO port bar(1) 00:05:12.906 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:05:12.906 EAL: Ignore mapping IO port bar(1) 00:05:12.906 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:05:13.167 EAL: Ignore mapping IO port bar(1) 00:05:13.167 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:05:13.428 EAL: Ignore mapping IO port bar(1) 00:05:13.428 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:05:13.689 EAL: Ignore mapping IO port bar(1) 00:05:13.689 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:05:13.689 EAL: Ignore mapping IO port bar(1) 00:05:13.950 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:05:13.950 EAL: Ignore mapping IO port bar(1) 00:05:14.212 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:05:14.472 EAL: Probe PCI driver: spdk_nvme (144d:a80a) device: 0000:65:00.0 (socket 0) 00:05:14.472 EAL: Ignore mapping IO port bar(1) 00:05:14.472 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:05:14.741 EAL: Ignore mapping IO port bar(1) 00:05:14.741 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:05:15.001 EAL: Ignore mapping IO port bar(1) 00:05:15.001 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:05:15.261 EAL: Ignore mapping IO port bar(1) 00:05:15.261 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:05:15.261 EAL: Ignore mapping IO port bar(1) 00:05:15.523 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:05:15.523 EAL: Ignore mapping IO port bar(1) 00:05:15.785 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:05:15.785 EAL: Ignore mapping IO port bar(1) 00:05:16.047 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 
0000:80:01.6 (socket 1) 00:05:16.047 EAL: Ignore mapping IO port bar(1) 00:05:16.047 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:05:16.047 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:05:16.047 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:05:16.308 Starting DPDK initialization... 00:05:16.308 Starting SPDK post initialization... 00:05:16.308 SPDK NVMe probe 00:05:16.308 Attaching to 0000:65:00.0 00:05:16.308 Attached to 0000:65:00.0 00:05:16.308 Cleaning up... 00:05:18.228 00:05:18.228 real 0m5.767s 00:05:18.228 user 0m0.207s 00:05:18.228 sys 0m0.111s 00:05:18.229 13:33:32 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:18.229 13:33:32 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:18.229 ************************************ 00:05:18.229 END TEST env_dpdk_post_init 00:05:18.229 ************************************ 00:05:18.229 13:33:32 env -- env/env.sh@26 -- # uname 00:05:18.229 13:33:32 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:18.229 13:33:32 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:18.229 13:33:32 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:18.229 13:33:32 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:18.229 13:33:32 env -- common/autotest_common.sh@10 -- # set +x 00:05:18.229 ************************************ 00:05:18.229 START TEST env_mem_callbacks 00:05:18.229 ************************************ 00:05:18.229 13:33:32 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:18.229 EAL: Detected CPU lcores: 128 00:05:18.229 EAL: Detected NUMA nodes: 2 00:05:18.229 EAL: Detected shared linkage of DPDK 00:05:18.229 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:18.229 
EAL: Selected IOVA mode 'PA' 00:05:18.229 EAL: VFIO support initialized 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.0 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.0_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.0_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.1 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.1_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.1_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.2 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.2_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.2_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.3 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.3_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.3_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.4 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.4_qat_asym 00:05:18.229 CRYPTODEV: 
Initialisation parameters - name: 0000:4d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.4_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.5 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.5_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.5_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.6 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.6_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.6_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:01.7 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.7_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:01.7_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.0 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.0_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.0_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.0_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.1 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.1_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.1_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.2 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.2_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.2_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.3 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.3_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.3_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.4 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.4_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.4_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.5 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.5_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 
0000:4d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.5_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.6 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.6_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.6_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4d:02.7 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.7_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4d:02.7_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.0 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.0_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.0_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.1 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.1_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.1_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.2 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.2_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.2_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.3 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.3_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.3_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.4 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.4_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.4_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.5 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.5_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.5_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.6 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.6_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.6_qat_asym,socket id: 0, max queue pairs: 
0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.6_qat_sym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.229 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:01.7 (socket 0) 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.7_qat_asym 00:05:18.229 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.229 CRYPTODEV: Creating cryptodev 0000:4f:01.7_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.0 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.0_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.0_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.1 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.1_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.1_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.2 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.2_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.2_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:4f:02.3 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.3_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.3_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.4 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.4_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.4_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.5 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.5_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.5_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.6 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.6_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.6_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:4f:02.7 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:4f:02.7_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 
0000:4f:02.7_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:4f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.0 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.0_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.0_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.1 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.1_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.1_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.2 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.2_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.2_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.3 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.3_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.3_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.4 (socket 0) 00:05:18.230 CRYPTODEV: 
Creating cryptodev 0000:51:01.4_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.4_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.5 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.5_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.5_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.6 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.6_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.6_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:01.7 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.7_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:01.7_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.0 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.0_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.0_qat_sym 00:05:18.230 CRYPTODEV: 
Initialisation parameters - name: 0000:51:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.1 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.1_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.1_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.2 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.2_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.2_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.3 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.3_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.3_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.4 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.4_qat_asym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.4_qat_sym 00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.5 (socket 0) 00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.5_qat_asym 
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.5_qat_sym
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.6 (socket 0)
00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.6_qat_asym
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.6_qat_sym
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:18.230 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:51:02.7 (socket 0)
00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.7_qat_asym
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:18.230 CRYPTODEV: Creating cryptodev 0000:51:02.7_qat_sym
00:05:18.230 CRYPTODEV: Initialisation parameters - name: 0000:51:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:18.230 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:18.230 
00:05:18.230 
00:05:18.230 CUnit - A unit testing framework for C - Version 2.1-3
00:05:18.230 http://cunit.sourceforge.net/
00:05:18.230 
00:05:18.230 
00:05:18.230 Suite: memory
00:05:18.230 Test: test ...
00:05:18.230 register 0x200000200000 2097152
00:05:18.230 malloc 3145728
00:05:18.230 register 0x200000400000 4194304
00:05:18.230 buf 0x200000500000 len 3145728 PASSED
00:05:18.230 malloc 64
00:05:18.230 buf 0x2000004fff40 len 64 PASSED
00:05:18.230 malloc 4194304
00:05:18.230 register 0x200000800000 6291456
00:05:18.230 buf 0x200000a00000 len 4194304 PASSED
00:05:18.230 free 0x200000500000 3145728
00:05:18.230 free 0x2000004fff40 64
00:05:18.230 unregister 0x200000400000 4194304 PASSED
00:05:18.230 free 0x200000a00000 4194304
00:05:18.230 unregister 0x200000800000 6291456 PASSED
00:05:18.230 malloc 8388608
00:05:18.230 register 0x200000400000 10485760
00:05:18.230 buf 0x200000600000 len 8388608 PASSED
00:05:18.230 free 0x200000600000 8388608
00:05:18.230 unregister 0x200000400000 10485760 PASSED
00:05:18.230 passed
00:05:18.230 
00:05:18.230 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:18.230               suites      1      1    n/a      0        0
00:05:18.230                tests      1      1      1      0        0
00:05:18.230              asserts     15     15     15      0      n/a
00:05:18.230 
00:05:18.231 Elapsed time = 0.009 seconds
00:05:18.231 
00:05:18.231 real 0m0.089s
00:05:18.231 user 0m0.031s
00:05:18.231 sys 0m0.058s
00:05:18.231 13:33:32 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # xtrace_disable
00:05:18.231 13:33:32 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:05:18.231 ************************************
00:05:18.231 END TEST env_mem_callbacks
00:05:18.231 ************************************
00:05:18.231 
00:05:18.231 real 0m7.467s
00:05:18.231 user 0m1.085s
00:05:18.231 sys 0m0.934s
00:05:18.231 13:33:32 env -- common/autotest_common.sh@1125 -- # xtrace_disable
00:05:18.231 13:33:32 env -- common/autotest_common.sh@10 -- # set +x
00:05:18.231 ************************************
00:05:18.231 END TEST env
00:05:18.231 ************************************
00:05:18.231 13:33:32 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
13:33:32 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
13:33:32 -- common/autotest_common.sh@1106 -- # xtrace_disable
13:33:32 -- common/autotest_common.sh@10 -- # set +x
00:05:18.231 ************************************
00:05:18.231 START TEST rpc
00:05:18.231 ************************************
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:05:18.231 * Looking for test storage...
00:05:18.231 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc
00:05:18.231 13:33:32 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1434875
00:05:18.231 13:33:32 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:05:18.231 13:33:32 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:05:18.231 13:33:32 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1434875
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@830 -- # '[' -z 1434875 ']'
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@835 -- # local max_retries=100
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@839 -- # xtrace_disable
00:05:18.231 13:33:32 rpc -- common/autotest_common.sh@10 -- # set +x
00:05:18.492 [2024-06-10 13:33:32.745417] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
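The env_mem_callbacks trace above pairs each allocator event with a memory-event callback: `register` fires when a `malloc` maps new memory into the process, and `unregister` fires when the matching `free` unmaps it again. A small illustrative Python model of that bookkeeping (the `CallbackTracker` class is hypothetical, not SPDK or DPDK code; the address and length come from the trace):

```python
# Illustrative model (not SPDK code): track which address ranges have been
# "registered" with a memory-event callback, the way the trace above pairs
# register/malloc and free/unregister events.

class CallbackTracker:
    def __init__(self):
        self.registered = {}   # addr -> length of currently mapped regions
        self.events = []       # ordered ("register"|"unregister", addr, length)

    def register(self, addr, length):
        # A registration event fires when the allocator maps new memory.
        self.registered[addr] = length
        self.events.append(("register", addr, length))

    def unregister(self, addr, length):
        # A deregistration event fires when that memory is unmapped again;
        # it must exactly match the earlier registration.
        assert self.registered.pop(addr) == length
        self.events.append(("unregister", addr, length))

tracker = CallbackTracker()
# Replay one register/unregister pair from the trace above:
tracker.register(0x200000400000, 4194304)    # malloc grew the heap
tracker.unregister(0x200000400000, 4194304)  # free shrank it again
assert tracker.registered == {}              # every region was paired up
```

The CUnit test in the log makes the same kind of check: each `register` printed at `malloc` time must be matched by an `unregister` at `free` time, which is why every `unregister` line ends in PASSED.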
00:05:18.492 [2024-06-10 13:33:32.745479] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1434875 ]
00:05:18.492 [2024-06-10 13:33:32.840458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:18.492 [2024-06-10 13:33:32.935901] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:05:18.492 [2024-06-10 13:33:32.935964] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1434875' to capture a snapshot of events at runtime.
[2024-06-10 13:33:32.935972] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only
[2024-06-10 13:33:32.935980] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running.
[2024-06-10 13:33:32.935986] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1434875 for offline analysis/debug.
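The `waitforlisten` step above blocks until the freshly started spdk_tgt accepts connections on its JSON-RPC socket, /var/tmp/spdk.sock, retrying up to `max_retries=100` times. A minimal sketch of such a wait loop in Python — `wait_for_unix_socket` is a hypothetical helper for illustration, not SPDK's actual shell implementation:

```python
import os
import socket
import time

def wait_for_unix_socket(path: str, timeout: float = 10.0,
                         interval: float = 0.1) -> bool:
    """Poll until a UNIX domain socket at `path` accepts connections, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            try:
                s.connect(path)   # succeeds only once the server is listening
                return True
            except OSError:
                pass              # socket file exists but nobody accepts yet
            finally:
                s.close()
        time.sleep(interval)
    return False
```

A plain existence check on the socket file is not enough: the file can appear before the server calls listen(), which is why the loop attempts a real connect on each retry.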
00:05:18.493 [2024-06-10 13:33:32.936013] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.437 13:33:33 rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:19.437 13:33:33 rpc -- common/autotest_common.sh@863 -- # return 0 00:05:19.437 13:33:33 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:19.437 13:33:33 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:19.437 13:33:33 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:19.437 13:33:33 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:19.437 13:33:33 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:19.437 13:33:33 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:19.437 13:33:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 ************************************ 00:05:19.437 START TEST rpc_integrity 00:05:19.437 ************************************ 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:19.437 { 00:05:19.437 "name": "Malloc0", 00:05:19.437 "aliases": [ 00:05:19.437 "6f051e3b-d1e8-4208-be3d-e0bfe3e25434" 00:05:19.437 ], 00:05:19.437 "product_name": "Malloc disk", 00:05:19.437 "block_size": 512, 00:05:19.437 "num_blocks": 16384, 00:05:19.437 "uuid": "6f051e3b-d1e8-4208-be3d-e0bfe3e25434", 00:05:19.437 "assigned_rate_limits": { 00:05:19.437 "rw_ios_per_sec": 0, 00:05:19.437 "rw_mbytes_per_sec": 0, 00:05:19.437 "r_mbytes_per_sec": 0, 00:05:19.437 "w_mbytes_per_sec": 0 00:05:19.437 }, 00:05:19.437 "claimed": false, 00:05:19.437 "zoned": false, 00:05:19.437 "supported_io_types": { 00:05:19.437 "read": true, 00:05:19.437 "write": true, 00:05:19.437 "unmap": true, 00:05:19.437 "write_zeroes": true, 00:05:19.437 "flush": true, 00:05:19.437 "reset": true, 00:05:19.437 "compare": false, 00:05:19.437 "compare_and_write": false, 00:05:19.437 "abort": true, 00:05:19.437 "nvme_admin": false, 00:05:19.437 "nvme_io": false 00:05:19.437 }, 00:05:19.437 
"memory_domains": [ 00:05:19.437 { 00:05:19.437 "dma_device_id": "system", 00:05:19.437 "dma_device_type": 1 00:05:19.437 }, 00:05:19.437 { 00:05:19.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.437 "dma_device_type": 2 00:05:19.437 } 00:05:19.437 ], 00:05:19.437 "driver_specific": {} 00:05:19.437 } 00:05:19.437 ]' 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 [2024-06-10 13:33:33.795252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:19.437 [2024-06-10 13:33:33.795301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:19.437 [2024-06-10 13:33:33.795317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211ca70 00:05:19.437 [2024-06-10 13:33:33.795325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:19.437 [2024-06-10 13:33:33.796921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:19.437 [2024-06-10 13:33:33.796957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:19.437 Passthru0 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.437 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.437 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.437 13:33:33 rpc.rpc_integrity 
-- rpc/rpc.sh@20 -- # bdevs='[ 00:05:19.437 { 00:05:19.437 "name": "Malloc0", 00:05:19.437 "aliases": [ 00:05:19.437 "6f051e3b-d1e8-4208-be3d-e0bfe3e25434" 00:05:19.437 ], 00:05:19.437 "product_name": "Malloc disk", 00:05:19.437 "block_size": 512, 00:05:19.437 "num_blocks": 16384, 00:05:19.437 "uuid": "6f051e3b-d1e8-4208-be3d-e0bfe3e25434", 00:05:19.437 "assigned_rate_limits": { 00:05:19.437 "rw_ios_per_sec": 0, 00:05:19.437 "rw_mbytes_per_sec": 0, 00:05:19.437 "r_mbytes_per_sec": 0, 00:05:19.437 "w_mbytes_per_sec": 0 00:05:19.437 }, 00:05:19.437 "claimed": true, 00:05:19.437 "claim_type": "exclusive_write", 00:05:19.437 "zoned": false, 00:05:19.437 "supported_io_types": { 00:05:19.437 "read": true, 00:05:19.437 "write": true, 00:05:19.437 "unmap": true, 00:05:19.437 "write_zeroes": true, 00:05:19.437 "flush": true, 00:05:19.437 "reset": true, 00:05:19.437 "compare": false, 00:05:19.437 "compare_and_write": false, 00:05:19.437 "abort": true, 00:05:19.437 "nvme_admin": false, 00:05:19.437 "nvme_io": false 00:05:19.437 }, 00:05:19.437 "memory_domains": [ 00:05:19.437 { 00:05:19.438 "dma_device_id": "system", 00:05:19.438 "dma_device_type": 1 00:05:19.438 }, 00:05:19.438 { 00:05:19.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.438 "dma_device_type": 2 00:05:19.438 } 00:05:19.438 ], 00:05:19.438 "driver_specific": {} 00:05:19.438 }, 00:05:19.438 { 00:05:19.438 "name": "Passthru0", 00:05:19.438 "aliases": [ 00:05:19.438 "581ef800-f602-5c3c-b9de-6a2bec58b069" 00:05:19.438 ], 00:05:19.438 "product_name": "passthru", 00:05:19.438 "block_size": 512, 00:05:19.438 "num_blocks": 16384, 00:05:19.438 "uuid": "581ef800-f602-5c3c-b9de-6a2bec58b069", 00:05:19.438 "assigned_rate_limits": { 00:05:19.438 "rw_ios_per_sec": 0, 00:05:19.438 "rw_mbytes_per_sec": 0, 00:05:19.438 "r_mbytes_per_sec": 0, 00:05:19.438 "w_mbytes_per_sec": 0 00:05:19.438 }, 00:05:19.438 "claimed": false, 00:05:19.438 "zoned": false, 00:05:19.438 "supported_io_types": { 00:05:19.438 "read": true, 
00:05:19.438 "write": true, 00:05:19.438 "unmap": true, 00:05:19.438 "write_zeroes": true, 00:05:19.438 "flush": true, 00:05:19.438 "reset": true, 00:05:19.438 "compare": false, 00:05:19.438 "compare_and_write": false, 00:05:19.438 "abort": true, 00:05:19.438 "nvme_admin": false, 00:05:19.438 "nvme_io": false 00:05:19.438 }, 00:05:19.438 "memory_domains": [ 00:05:19.438 { 00:05:19.438 "dma_device_id": "system", 00:05:19.438 "dma_device_type": 1 00:05:19.438 }, 00:05:19.438 { 00:05:19.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.438 "dma_device_type": 2 00:05:19.438 } 00:05:19.438 ], 00:05:19.438 "driver_specific": { 00:05:19.438 "passthru": { 00:05:19.438 "name": "Passthru0", 00:05:19.438 "base_bdev_name": "Malloc0" 00:05:19.438 } 00:05:19.438 } 00:05:19.438 } 00:05:19.438 ]' 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.438 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.438 13:33:33 rpc.rpc_integrity -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:19.438 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:19.700 13:33:33 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:19.700 00:05:19.700 real 0m0.306s 00:05:19.700 user 0m0.187s 00:05:19.700 sys 0m0.048s 00:05:19.700 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:19.700 13:33:33 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 ************************************ 00:05:19.700 END TEST rpc_integrity 00:05:19.700 ************************************ 00:05:19.700 13:33:33 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:19.700 13:33:33 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:19.700 13:33:33 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:19.700 13:33:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 ************************************ 00:05:19.700 START TEST rpc_plugins 00:05:19.700 ************************************ 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # rpc_plugins 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:19.700 { 00:05:19.700 "name": "Malloc1", 00:05:19.700 "aliases": [ 00:05:19.700 "3e73b8e4-9204-413a-8504-78d5b78080e3" 00:05:19.700 ], 00:05:19.700 "product_name": "Malloc disk", 00:05:19.700 "block_size": 4096, 00:05:19.700 "num_blocks": 256, 00:05:19.700 "uuid": "3e73b8e4-9204-413a-8504-78d5b78080e3", 00:05:19.700 "assigned_rate_limits": { 00:05:19.700 "rw_ios_per_sec": 0, 00:05:19.700 "rw_mbytes_per_sec": 0, 00:05:19.700 "r_mbytes_per_sec": 0, 00:05:19.700 "w_mbytes_per_sec": 0 00:05:19.700 }, 00:05:19.700 "claimed": false, 00:05:19.700 "zoned": false, 00:05:19.700 "supported_io_types": { 00:05:19.700 "read": true, 00:05:19.700 "write": true, 00:05:19.700 "unmap": true, 00:05:19.700 "write_zeroes": true, 00:05:19.700 "flush": true, 00:05:19.700 "reset": true, 00:05:19.700 "compare": false, 00:05:19.700 "compare_and_write": false, 00:05:19.700 "abort": true, 00:05:19.700 "nvme_admin": false, 00:05:19.700 "nvme_io": false 00:05:19.700 }, 00:05:19.700 "memory_domains": [ 00:05:19.700 { 00:05:19.700 "dma_device_id": "system", 00:05:19.700 "dma_device_type": 1 00:05:19.700 }, 00:05:19.700 { 00:05:19.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:19.700 "dma_device_type": 2 00:05:19.700 } 00:05:19.700 ], 00:05:19.700 "driver_specific": {} 00:05:19.700 } 00:05:19.700 ]' 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:19.700 13:33:34 
rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:19.700 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:19.700 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:19.961 13:33:34 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:19.961 00:05:19.961 real 0m0.153s 00:05:19.961 user 0m0.091s 00:05:19.961 sys 0m0.023s 00:05:19.961 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:19.961 13:33:34 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:19.961 ************************************ 00:05:19.961 END TEST rpc_plugins 00:05:19.961 ************************************ 00:05:19.961 13:33:34 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:19.961 13:33:34 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:19.961 13:33:34 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:19.961 13:33:34 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.961 ************************************ 00:05:19.961 START TEST rpc_trace_cmd_test 00:05:19.961 ************************************ 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # rpc_trace_cmd_test 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:19.961 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:19.961 "tpoint_shm_path": 
"/dev/shm/spdk_tgt_trace.pid1434875", 00:05:19.961 "tpoint_group_mask": "0x8", 00:05:19.961 "iscsi_conn": { 00:05:19.961 "mask": "0x2", 00:05:19.961 "tpoint_mask": "0x0" 00:05:19.961 }, 00:05:19.961 "scsi": { 00:05:19.962 "mask": "0x4", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "bdev": { 00:05:19.962 "mask": "0x8", 00:05:19.962 "tpoint_mask": "0xffffffffffffffff" 00:05:19.962 }, 00:05:19.962 "nvmf_rdma": { 00:05:19.962 "mask": "0x10", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "nvmf_tcp": { 00:05:19.962 "mask": "0x20", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "ftl": { 00:05:19.962 "mask": "0x40", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "blobfs": { 00:05:19.962 "mask": "0x80", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "dsa": { 00:05:19.962 "mask": "0x200", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "thread": { 00:05:19.962 "mask": "0x400", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "nvme_pcie": { 00:05:19.962 "mask": "0x800", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "iaa": { 00:05:19.962 "mask": "0x1000", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "nvme_tcp": { 00:05:19.962 "mask": "0x2000", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "bdev_nvme": { 00:05:19.962 "mask": "0x4000", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 }, 00:05:19.962 "sock": { 00:05:19.962 "mask": "0x8000", 00:05:19.962 "tpoint_mask": "0x0" 00:05:19.962 } 00:05:19.962 }' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:19.962 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:20.223 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:20.223 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:20.223 13:33:34 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:20.223 00:05:20.223 real 0m0.251s 00:05:20.223 user 0m0.215s 00:05:20.223 sys 0m0.030s 00:05:20.223 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:20.223 13:33:34 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:20.223 ************************************ 00:05:20.223 END TEST rpc_trace_cmd_test 00:05:20.223 ************************************ 00:05:20.223 13:33:34 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:20.223 13:33:34 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:20.224 13:33:34 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:20.224 13:33:34 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:20.224 13:33:34 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:20.224 13:33:34 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:20.224 ************************************ 00:05:20.224 START TEST rpc_daemon_integrity 00:05:20.224 ************************************ 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.224 13:33:34 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:20.224 { 00:05:20.224 "name": "Malloc2", 00:05:20.224 "aliases": [ 00:05:20.224 "6ff67cf8-e527-4121-bdd1-c0f02de058f1" 00:05:20.224 ], 00:05:20.224 "product_name": "Malloc disk", 00:05:20.224 "block_size": 512, 00:05:20.224 "num_blocks": 16384, 00:05:20.224 "uuid": "6ff67cf8-e527-4121-bdd1-c0f02de058f1", 00:05:20.224 "assigned_rate_limits": { 00:05:20.224 "rw_ios_per_sec": 0, 00:05:20.224 "rw_mbytes_per_sec": 0, 00:05:20.224 "r_mbytes_per_sec": 0, 00:05:20.224 "w_mbytes_per_sec": 0 00:05:20.224 }, 00:05:20.224 "claimed": false, 00:05:20.224 "zoned": false, 00:05:20.224 "supported_io_types": { 00:05:20.224 "read": true, 00:05:20.224 "write": true, 00:05:20.224 "unmap": true, 00:05:20.224 "write_zeroes": true, 00:05:20.224 "flush": true, 00:05:20.224 "reset": true, 00:05:20.224 "compare": false, 00:05:20.224 "compare_and_write": 
false, 00:05:20.224 "abort": true, 00:05:20.224 "nvme_admin": false, 00:05:20.224 "nvme_io": false 00:05:20.224 }, 00:05:20.224 "memory_domains": [ 00:05:20.224 { 00:05:20.224 "dma_device_id": "system", 00:05:20.224 "dma_device_type": 1 00:05:20.224 }, 00:05:20.224 { 00:05:20.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:20.224 "dma_device_type": 2 00:05:20.224 } 00:05:20.224 ], 00:05:20.224 "driver_specific": {} 00:05:20.224 } 00:05:20.224 ]' 00:05:20.224 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:20.485 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:20.485 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:20.485 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.485 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.485 [2024-06-10 13:33:34.733777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:20.485 [2024-06-10 13:33:34.733816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:20.486 [2024-06-10 13:33:34.733834] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211c6e0 00:05:20.486 [2024-06-10 13:33:34.733842] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:20.486 [2024-06-10 13:33:34.735312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:20.486 [2024-06-10 13:33:34.735345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:20.486 Passthru0 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.486 13:33:34 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:20.486 { 00:05:20.486 "name": "Malloc2", 00:05:20.486 "aliases": [ 00:05:20.486 "6ff67cf8-e527-4121-bdd1-c0f02de058f1" 00:05:20.486 ], 00:05:20.486 "product_name": "Malloc disk", 00:05:20.486 "block_size": 512, 00:05:20.486 "num_blocks": 16384, 00:05:20.486 "uuid": "6ff67cf8-e527-4121-bdd1-c0f02de058f1", 00:05:20.486 "assigned_rate_limits": { 00:05:20.486 "rw_ios_per_sec": 0, 00:05:20.486 "rw_mbytes_per_sec": 0, 00:05:20.486 "r_mbytes_per_sec": 0, 00:05:20.486 "w_mbytes_per_sec": 0 00:05:20.486 }, 00:05:20.486 "claimed": true, 00:05:20.486 "claim_type": "exclusive_write", 00:05:20.486 "zoned": false, 00:05:20.486 "supported_io_types": { 00:05:20.486 "read": true, 00:05:20.486 "write": true, 00:05:20.486 "unmap": true, 00:05:20.486 "write_zeroes": true, 00:05:20.486 "flush": true, 00:05:20.486 "reset": true, 00:05:20.486 "compare": false, 00:05:20.486 "compare_and_write": false, 00:05:20.486 "abort": true, 00:05:20.486 "nvme_admin": false, 00:05:20.486 "nvme_io": false 00:05:20.486 }, 00:05:20.486 "memory_domains": [ 00:05:20.486 { 00:05:20.486 "dma_device_id": "system", 00:05:20.486 "dma_device_type": 1 00:05:20.486 }, 00:05:20.486 { 00:05:20.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:20.486 "dma_device_type": 2 00:05:20.486 } 00:05:20.486 ], 00:05:20.486 "driver_specific": {} 00:05:20.486 }, 00:05:20.486 { 00:05:20.486 "name": "Passthru0", 00:05:20.486 "aliases": [ 00:05:20.486 "5241b522-3a9a-5a6e-9dc2-bc296329eae1" 00:05:20.486 ], 00:05:20.486 "product_name": "passthru", 00:05:20.486 "block_size": 512, 00:05:20.486 "num_blocks": 16384, 00:05:20.486 "uuid": "5241b522-3a9a-5a6e-9dc2-bc296329eae1", 00:05:20.486 "assigned_rate_limits": { 00:05:20.486 "rw_ios_per_sec": 0, 00:05:20.486 "rw_mbytes_per_sec": 0, 
00:05:20.486 "r_mbytes_per_sec": 0, 00:05:20.486 "w_mbytes_per_sec": 0 00:05:20.486 }, 00:05:20.486 "claimed": false, 00:05:20.486 "zoned": false, 00:05:20.486 "supported_io_types": { 00:05:20.486 "read": true, 00:05:20.486 "write": true, 00:05:20.486 "unmap": true, 00:05:20.486 "write_zeroes": true, 00:05:20.486 "flush": true, 00:05:20.486 "reset": true, 00:05:20.486 "compare": false, 00:05:20.486 "compare_and_write": false, 00:05:20.486 "abort": true, 00:05:20.486 "nvme_admin": false, 00:05:20.486 "nvme_io": false 00:05:20.486 }, 00:05:20.486 "memory_domains": [ 00:05:20.486 { 00:05:20.486 "dma_device_id": "system", 00:05:20.486 "dma_device_type": 1 00:05:20.486 }, 00:05:20.486 { 00:05:20.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:20.486 "dma_device_type": 2 00:05:20.486 } 00:05:20.486 ], 00:05:20.486 "driver_specific": { 00:05:20.486 "passthru": { 00:05:20.486 "name": "Passthru0", 00:05:20.486 "base_bdev_name": "Malloc2" 00:05:20.486 } 00:05:20.486 } 00:05:20.486 } 00:05:20.486 ]' 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.486 13:33:34 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:20.486 00:05:20.486 real 0m0.300s 00:05:20.486 user 0m0.191s 00:05:20.486 sys 0m0.041s 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:20.486 13:33:34 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:20.486 ************************************ 00:05:20.486 END TEST rpc_daemon_integrity 00:05:20.486 ************************************ 00:05:20.486 13:33:34 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:20.486 13:33:34 rpc -- rpc/rpc.sh@84 -- # killprocess 1434875 00:05:20.486 13:33:34 rpc -- common/autotest_common.sh@949 -- # '[' -z 1434875 ']' 00:05:20.486 13:33:34 rpc -- common/autotest_common.sh@953 -- # kill -0 1434875 00:05:20.486 13:33:34 rpc -- common/autotest_common.sh@954 -- # uname 00:05:20.486 13:33:34 rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:20.486 13:33:34 rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1434875 00:05:20.747 13:33:34 rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:20.747 13:33:34 rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:20.747 13:33:34 rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1434875' 00:05:20.747 killing process with pid 1434875 00:05:20.747 13:33:34 rpc -- common/autotest_common.sh@968 -- # kill 1434875 
00:05:20.747 13:33:34 rpc -- common/autotest_common.sh@973 -- # wait 1434875 00:05:21.008 00:05:21.008 real 0m2.657s 00:05:21.008 user 0m3.479s 00:05:21.008 sys 0m0.787s 00:05:21.008 13:33:35 rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:21.008 13:33:35 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:21.008 ************************************ 00:05:21.008 END TEST rpc 00:05:21.008 ************************************ 00:05:21.008 13:33:35 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:21.008 13:33:35 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:21.008 13:33:35 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:21.008 13:33:35 -- common/autotest_common.sh@10 -- # set +x 00:05:21.008 ************************************ 00:05:21.008 START TEST skip_rpc 00:05:21.008 ************************************ 00:05:21.008 13:33:35 skip_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:21.008 * Looking for test storage... 
00:05:21.008 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:21.008 13:33:35 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:21.008 13:33:35 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:21.008 13:33:35 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:21.008 13:33:35 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:21.008 13:33:35 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:21.008 13:33:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:21.008 ************************************ 00:05:21.008 START TEST skip_rpc 00:05:21.008 ************************************ 00:05:21.008 13:33:35 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # test_skip_rpc 00:05:21.008 13:33:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1435612 00:05:21.008 13:33:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:21.008 13:33:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:21.009 13:33:35 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:21.271 [2024-06-10 13:33:35.503064] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:21.271 [2024-06-10 13:33:35.503123] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435612 ] 00:05:21.271 [2024-06-10 13:33:35.597237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:21.271 [2024-06-10 13:33:35.692283] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@649 -- # local es=0 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # rpc_cmd spdk_get_version 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:26.563 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # es=1 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - 
SIGINT SIGTERM EXIT 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1435612 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@949 -- # '[' -z 1435612 ']' 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # kill -0 1435612 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # uname 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1435612 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1435612' 00:05:26.564 killing process with pid 1435612 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # kill 1435612 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # wait 1435612 00:05:26.564 00:05:26.564 real 0m5.282s 00:05:26.564 user 0m5.001s 00:05:26.564 sys 0m0.306s 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:26.564 13:33:40 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:26.564 ************************************ 00:05:26.564 END TEST skip_rpc 00:05:26.564 ************************************ 00:05:26.564 13:33:40 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:26.564 13:33:40 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:26.564 13:33:40 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:26.564 13:33:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:26.564 ************************************ 00:05:26.564 START TEST skip_rpc_with_json 
00:05:26.564 ************************************ 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_json 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1436734 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1436734 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@830 -- # '[' -z 1436734 ']' 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:26.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:26.564 13:33:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:26.564 [2024-06-10 13:33:40.858818] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:26.564 [2024-06-10 13:33:40.858863] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1436734 ] 00:05:26.564 [2024-06-10 13:33:40.946948] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:26.564 [2024-06-10 13:33:41.013034] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@863 -- # return 0 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:27.506 [2024-06-10 13:33:41.709836] nvmf_rpc.c:2560:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:27.506 request: 00:05:27.506 { 00:05:27.506 "trtype": "tcp", 00:05:27.506 "method": "nvmf_get_transports", 00:05:27.506 "req_id": 1 00:05:27.506 } 00:05:27.506 Got JSON-RPC error response 00:05:27.506 response: 00:05:27.506 { 00:05:27.506 "code": -19, 00:05:27.506 "message": "No such device" 00:05:27.506 } 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:27.506 [2024-06-10 13:33:41.721954] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:27.506 13:33:41 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:27.506 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:27.506 { 00:05:27.506 "subsystems": [ 00:05:27.506 { 00:05:27.506 "subsystem": "keyring", 00:05:27.506 "config": [] 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "subsystem": "iobuf", 00:05:27.506 "config": [ 00:05:27.506 { 00:05:27.506 "method": "iobuf_set_options", 00:05:27.506 "params": { 00:05:27.506 "small_pool_count": 8192, 00:05:27.506 "large_pool_count": 1024, 00:05:27.506 "small_bufsize": 8192, 00:05:27.506 "large_bufsize": 135168 00:05:27.506 } 00:05:27.506 } 00:05:27.506 ] 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "subsystem": "sock", 00:05:27.506 "config": [ 00:05:27.506 { 00:05:27.506 "method": "sock_set_default_impl", 00:05:27.506 "params": { 00:05:27.506 "impl_name": "posix" 00:05:27.506 } 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "method": "sock_impl_set_options", 00:05:27.506 "params": { 00:05:27.506 "impl_name": "ssl", 00:05:27.506 "recv_buf_size": 4096, 00:05:27.506 "send_buf_size": 4096, 00:05:27.506 "enable_recv_pipe": true, 00:05:27.506 "enable_quickack": false, 00:05:27.506 "enable_placement_id": 0, 00:05:27.506 "enable_zerocopy_send_server": true, 00:05:27.506 "enable_zerocopy_send_client": false, 00:05:27.506 "zerocopy_threshold": 0, 00:05:27.506 "tls_version": 0, 00:05:27.506 "enable_ktls": false, 00:05:27.506 "enable_new_session_tickets": true 00:05:27.506 } 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "method": 
"sock_impl_set_options", 00:05:27.506 "params": { 00:05:27.506 "impl_name": "posix", 00:05:27.506 "recv_buf_size": 2097152, 00:05:27.506 "send_buf_size": 2097152, 00:05:27.506 "enable_recv_pipe": true, 00:05:27.506 "enable_quickack": false, 00:05:27.506 "enable_placement_id": 0, 00:05:27.506 "enable_zerocopy_send_server": true, 00:05:27.506 "enable_zerocopy_send_client": false, 00:05:27.506 "zerocopy_threshold": 0, 00:05:27.506 "tls_version": 0, 00:05:27.506 "enable_ktls": false, 00:05:27.506 "enable_new_session_tickets": false 00:05:27.506 } 00:05:27.506 } 00:05:27.506 ] 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "subsystem": "vmd", 00:05:27.506 "config": [] 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "subsystem": "accel", 00:05:27.506 "config": [ 00:05:27.506 { 00:05:27.506 "method": "accel_set_options", 00:05:27.506 "params": { 00:05:27.506 "small_cache_size": 128, 00:05:27.506 "large_cache_size": 16, 00:05:27.506 "task_count": 2048, 00:05:27.506 "sequence_count": 2048, 00:05:27.506 "buf_count": 2048 00:05:27.506 } 00:05:27.506 } 00:05:27.506 ] 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "subsystem": "bdev", 00:05:27.506 "config": [ 00:05:27.506 { 00:05:27.506 "method": "bdev_set_options", 00:05:27.506 "params": { 00:05:27.506 "bdev_io_pool_size": 65535, 00:05:27.506 "bdev_io_cache_size": 256, 00:05:27.506 "bdev_auto_examine": true, 00:05:27.506 "iobuf_small_cache_size": 128, 00:05:27.506 "iobuf_large_cache_size": 16 00:05:27.506 } 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "method": "bdev_raid_set_options", 00:05:27.506 "params": { 00:05:27.506 "process_window_size_kb": 1024 00:05:27.506 } 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "method": "bdev_iscsi_set_options", 00:05:27.506 "params": { 00:05:27.506 "timeout_sec": 30 00:05:27.506 } 00:05:27.506 }, 00:05:27.506 { 00:05:27.506 "method": "bdev_nvme_set_options", 00:05:27.506 "params": { 00:05:27.506 "action_on_timeout": "none", 00:05:27.506 "timeout_us": 0, 00:05:27.506 "timeout_admin_us": 0, 00:05:27.506 
"keep_alive_timeout_ms": 10000, 00:05:27.506 "arbitration_burst": 0, 00:05:27.506 "low_priority_weight": 0, 00:05:27.506 "medium_priority_weight": 0, 00:05:27.506 "high_priority_weight": 0, 00:05:27.506 "nvme_adminq_poll_period_us": 10000, 00:05:27.506 "nvme_ioq_poll_period_us": 0, 00:05:27.506 "io_queue_requests": 0, 00:05:27.506 "delay_cmd_submit": true, 00:05:27.506 "transport_retry_count": 4, 00:05:27.506 "bdev_retry_count": 3, 00:05:27.506 "transport_ack_timeout": 0, 00:05:27.506 "ctrlr_loss_timeout_sec": 0, 00:05:27.506 "reconnect_delay_sec": 0, 00:05:27.506 "fast_io_fail_timeout_sec": 0, 00:05:27.506 "disable_auto_failback": false, 00:05:27.506 "generate_uuids": false, 00:05:27.506 "transport_tos": 0, 00:05:27.506 "nvme_error_stat": false, 00:05:27.506 "rdma_srq_size": 0, 00:05:27.506 "io_path_stat": false, 00:05:27.506 "allow_accel_sequence": false, 00:05:27.506 "rdma_max_cq_size": 0, 00:05:27.507 "rdma_cm_event_timeout_ms": 0, 00:05:27.507 "dhchap_digests": [ 00:05:27.507 "sha256", 00:05:27.507 "sha384", 00:05:27.507 "sha512" 00:05:27.507 ], 00:05:27.507 "dhchap_dhgroups": [ 00:05:27.507 "null", 00:05:27.507 "ffdhe2048", 00:05:27.507 "ffdhe3072", 00:05:27.507 "ffdhe4096", 00:05:27.507 "ffdhe6144", 00:05:27.507 "ffdhe8192" 00:05:27.507 ] 00:05:27.507 } 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "method": "bdev_nvme_set_hotplug", 00:05:27.507 "params": { 00:05:27.507 "period_us": 100000, 00:05:27.507 "enable": false 00:05:27.507 } 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "method": "bdev_wait_for_examine" 00:05:27.507 } 00:05:27.507 ] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "scsi", 00:05:27.507 "config": null 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "scheduler", 00:05:27.507 "config": [ 00:05:27.507 { 00:05:27.507 "method": "framework_set_scheduler", 00:05:27.507 "params": { 00:05:27.507 "name": "static" 00:05:27.507 } 00:05:27.507 } 00:05:27.507 ] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "vhost_scsi", 
00:05:27.507 "config": [] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "vhost_blk", 00:05:27.507 "config": [] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "ublk", 00:05:27.507 "config": [] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "nbd", 00:05:27.507 "config": [] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "nvmf", 00:05:27.507 "config": [ 00:05:27.507 { 00:05:27.507 "method": "nvmf_set_config", 00:05:27.507 "params": { 00:05:27.507 "discovery_filter": "match_any", 00:05:27.507 "admin_cmd_passthru": { 00:05:27.507 "identify_ctrlr": false 00:05:27.507 } 00:05:27.507 } 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "method": "nvmf_set_max_subsystems", 00:05:27.507 "params": { 00:05:27.507 "max_subsystems": 1024 00:05:27.507 } 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "method": "nvmf_set_crdt", 00:05:27.507 "params": { 00:05:27.507 "crdt1": 0, 00:05:27.507 "crdt2": 0, 00:05:27.507 "crdt3": 0 00:05:27.507 } 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "method": "nvmf_create_transport", 00:05:27.507 "params": { 00:05:27.507 "trtype": "TCP", 00:05:27.507 "max_queue_depth": 128, 00:05:27.507 "max_io_qpairs_per_ctrlr": 127, 00:05:27.507 "in_capsule_data_size": 4096, 00:05:27.507 "max_io_size": 131072, 00:05:27.507 "io_unit_size": 131072, 00:05:27.507 "max_aq_depth": 128, 00:05:27.507 "num_shared_buffers": 511, 00:05:27.507 "buf_cache_size": 4294967295, 00:05:27.507 "dif_insert_or_strip": false, 00:05:27.507 "zcopy": false, 00:05:27.507 "c2h_success": true, 00:05:27.507 "sock_priority": 0, 00:05:27.507 "abort_timeout_sec": 1, 00:05:27.507 "ack_timeout": 0, 00:05:27.507 "data_wr_pool_size": 0 00:05:27.507 } 00:05:27.507 } 00:05:27.507 ] 00:05:27.507 }, 00:05:27.507 { 00:05:27.507 "subsystem": "iscsi", 00:05:27.507 "config": [ 00:05:27.507 { 00:05:27.507 "method": "iscsi_set_options", 00:05:27.507 "params": { 00:05:27.507 "node_base": "iqn.2016-06.io.spdk", 00:05:27.507 "max_sessions": 128, 00:05:27.507 
"max_connections_per_session": 2, 00:05:27.507 "max_queue_depth": 64, 00:05:27.507 "default_time2wait": 2, 00:05:27.507 "default_time2retain": 20, 00:05:27.507 "first_burst_length": 8192, 00:05:27.507 "immediate_data": true, 00:05:27.507 "allow_duplicated_isid": false, 00:05:27.507 "error_recovery_level": 0, 00:05:27.507 "nop_timeout": 60, 00:05:27.507 "nop_in_interval": 30, 00:05:27.507 "disable_chap": false, 00:05:27.507 "require_chap": false, 00:05:27.507 "mutual_chap": false, 00:05:27.507 "chap_group": 0, 00:05:27.507 "max_large_datain_per_connection": 64, 00:05:27.507 "max_r2t_per_connection": 4, 00:05:27.507 "pdu_pool_size": 36864, 00:05:27.507 "immediate_data_pool_size": 16384, 00:05:27.507 "data_out_pool_size": 2048 00:05:27.507 } 00:05:27.507 } 00:05:27.507 ] 00:05:27.507 } 00:05:27.507 ] 00:05:27.507 } 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1436734 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 1436734 ']' 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 1436734 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1436734 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1436734' 00:05:27.507 killing process with pid 1436734 00:05:27.507 13:33:41 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 1436734 00:05:27.507 13:33:41 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 1436734 00:05:27.777 13:33:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1436993 00:05:27.777 13:33:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:27.777 13:33:42 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 1436993 ']' 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1436993' 00:05:33.070 killing process with pid 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 1436993 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:33.070 00:05:33.070 real 0m6.630s 00:05:33.070 user 0m6.532s 00:05:33.070 sys 0m0.572s 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:33.070 ************************************ 00:05:33.070 END TEST skip_rpc_with_json 00:05:33.070 ************************************ 00:05:33.070 13:33:47 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:33.070 13:33:47 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:33.070 13:33:47 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:33.070 13:33:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:33.070 ************************************ 00:05:33.070 START TEST skip_rpc_with_delay 00:05:33.070 ************************************ 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_delay 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@649 -- # local es=0 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:33.070 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:33.330 [2024-06-10 13:33:47.571046] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:05:33.330 [2024-06-10 13:33:47.571130] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # es=1 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:33.330 00:05:33.330 real 0m0.081s 00:05:33.330 user 0m0.053s 00:05:33.330 sys 0m0.028s 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:33.330 13:33:47 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:33.330 ************************************ 00:05:33.330 END TEST skip_rpc_with_delay 00:05:33.330 ************************************ 00:05:33.330 13:33:47 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:33.330 13:33:47 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:33.330 13:33:47 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:33.330 13:33:47 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:33.330 13:33:47 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:33.330 13:33:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:33.330 ************************************ 00:05:33.330 START TEST exit_on_failed_rpc_init 00:05:33.330 ************************************ 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # test_exit_on_failed_rpc_init 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1438119 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1438119 00:05:33.330 13:33:47 
skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@830 -- # '[' -z 1438119 ']' 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:33.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:33.330 13:33:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:33.330 [2024-06-10 13:33:47.729126] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:33.330 [2024-06-10 13:33:47.729179] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1438119 ] 00:05:33.590 [2024-06-10 13:33:47.817733] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.590 [2024-06-10 13:33:47.883343] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@863 -- # return 0 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@649 -- # local es=0 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:34.227 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:34.227 [2024-06-10 13:33:48.647905] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:05:34.227 [2024-06-10 13:33:48.647952] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1438245 ] 00:05:34.538 [2024-06-10 13:33:48.716939] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.538 [2024-06-10 13:33:48.781630] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.538 [2024-06-10 13:33:48.781691] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:34.538 [2024-06-10 13:33:48.781701] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:34.538 [2024-06-10 13:33:48.781708] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # es=234 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # es=106 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # case "$es" in 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@669 -- # es=1 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1438119 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@949 -- # '[' -z 1438119 ']' 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # kill -0 1438119 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # uname 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1438119 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1438119' 
00:05:34.538 killing process with pid 1438119 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # kill 1438119 00:05:34.538 13:33:48 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # wait 1438119 00:05:34.797 00:05:34.797 real 0m1.440s 00:05:34.797 user 0m1.727s 00:05:34.797 sys 0m0.400s 00:05:34.797 13:33:49 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:34.797 13:33:49 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:34.797 ************************************ 00:05:34.797 END TEST exit_on_failed_rpc_init 00:05:34.797 ************************************ 00:05:34.797 13:33:49 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:34.797 00:05:34.797 real 0m13.851s 00:05:34.797 user 0m13.463s 00:05:34.797 sys 0m1.596s 00:05:34.797 13:33:49 skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:34.797 13:33:49 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:34.797 ************************************ 00:05:34.797 END TEST skip_rpc 00:05:34.797 ************************************ 00:05:34.797 13:33:49 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:34.797 13:33:49 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:34.797 13:33:49 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:34.797 13:33:49 -- common/autotest_common.sh@10 -- # set +x 00:05:34.797 ************************************ 00:05:34.797 START TEST rpc_client 00:05:34.797 ************************************ 00:05:34.797 13:33:49 rpc_client -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:35.058 * Looking for test storage... 
00:05:35.058 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:35.058 13:33:49 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:35.058 OK 00:05:35.058 13:33:49 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:35.058 00:05:35.058 real 0m0.128s 00:05:35.058 user 0m0.058s 00:05:35.058 sys 0m0.078s 00:05:35.058 13:33:49 rpc_client -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:35.058 13:33:49 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:35.058 ************************************ 00:05:35.058 END TEST rpc_client 00:05:35.058 ************************************ 00:05:35.058 13:33:49 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:35.058 13:33:49 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:35.058 13:33:49 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:35.058 13:33:49 -- common/autotest_common.sh@10 -- # set +x 00:05:35.058 ************************************ 00:05:35.058 START TEST json_config 00:05:35.058 ************************************ 00:05:35.058 13:33:49 json_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:35.058 13:33:49 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:35.058 13:33:49 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:35.058 13:33:49 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:35.058 13:33:49 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.058 13:33:49 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.058 13:33:49 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.058 13:33:49 json_config -- paths/export.sh@5 -- # export PATH 00:05:35.058 13:33:49 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@47 -- # : 0 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:35.058 13:33:49 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@355 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:35.058 13:33:49 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:05:35.059 INFO: JSON configuration test init 00:05:35.059 13:33:49 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:05:35.059 13:33:49 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:05:35.059 13:33:49 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:35.059 13:33:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.059 13:33:49 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:05:35.059 13:33:49 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:35.059 13:33:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.059 13:33:49 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:05:35.319 13:33:49 json_config -- json_config/common.sh@9 -- # local app=target 00:05:35.319 13:33:49 json_config -- json_config/common.sh@10 -- # shift 00:05:35.319 13:33:49 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:35.319 13:33:49 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:35.319 13:33:49 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:35.319 13:33:49 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:35.319 13:33:49 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:35.319 13:33:49 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1438576 00:05:35.319 13:33:49 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:35.319 Waiting for target to run... 
00:05:35.319 13:33:49 json_config -- json_config/common.sh@25 -- # waitforlisten 1438576 /var/tmp/spdk_tgt.sock 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@830 -- # '[' -z 1438576 ']' 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:35.319 13:33:49 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:35.319 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:35.319 13:33:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.319 [2024-06-10 13:33:49.607308] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:35.319 [2024-06-10 13:33:49.607375] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1438576 ] 00:05:35.580 [2024-06-10 13:33:49.977396] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.580 [2024-06-10 13:33:50.034951] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.150 13:33:50 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:36.150 13:33:50 json_config -- common/autotest_common.sh@863 -- # return 0 00:05:36.150 13:33:50 json_config -- json_config/common.sh@26 -- # echo '' 00:05:36.150 00:05:36.150 13:33:50 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:05:36.150 13:33:50 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:05:36.150 13:33:50 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:36.150 13:33:50 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:36.150 13:33:50 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:05:36.150 13:33:50 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:36.150 13:33:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:36.410 13:33:50 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:36.410 13:33:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:36.410 [2024-06-10 13:33:50.845251] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:36.410 13:33:50 
json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:36.410 13:33:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:36.670 [2024-06-10 13:33:51.033703] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:36.670 13:33:51 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:05:36.670 13:33:51 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:36.670 13:33:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:36.670 13:33:51 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:36.670 13:33:51 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:05:36.670 13:33:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:36.930 [2024-06-10 13:33:51.294459] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:39.474 13:33:53 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:05:39.474 13:33:53 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:39.474 13:33:53 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:39.474 13:33:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:39.474 13:33:53 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:39.475 13:33:53 json_config -- 
json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:39.475 13:33:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:05:39.475 13:33:53 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:39.475 13:33:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@55 -- # return 0 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:05:39.475 13:33:53 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:05:39.475 13:33:53 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:39.475 13:33:53 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@59 
-- # local ev_type ev_ctx event_id 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:39.735 13:33:53 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:39.735 13:33:53 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:05:39.735 13:33:54 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:39.735 13:33:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:39.995 Nvme0n1p0 Nvme0n1p1 00:05:39.995 13:33:54 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:39.995 13:33:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:40.255 [2024-06-10 13:33:54.553516] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:40.255 [2024-06-10 13:33:54.553558] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:40.255 
00:05:40.255 13:33:54 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:40.255 13:33:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:40.515 Malloc3 00:05:40.515 13:33:54 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:40.515 13:33:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:40.515 [2024-06-10 13:33:54.942563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:40.515 [2024-06-10 13:33:54.942598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:40.515 [2024-06-10 13:33:54.942612] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc6d1b0 00:05:40.515 [2024-06-10 13:33:54.942619] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:40.515 [2024-06-10 13:33:54.943914] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:40.515 [2024-06-10 13:33:54.943933] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:40.515 PTBdevFromMalloc3 00:05:40.515 13:33:54 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:40.515 13:33:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:40.776 Null0 00:05:40.776 13:33:55 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:40.776 13:33:55 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:41.037 Malloc0 00:05:41.037 13:33:55 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:41.037 13:33:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:41.314 Malloc1 00:05:41.314 13:33:55 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:41.314 13:33:55 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:41.314 102400+0 records in 00:05:41.314 102400+0 records out 00:05:41.314 104857600 bytes (105 MB, 100 MiB) copied, 0.124255 s, 844 MB/s 00:05:41.314 13:33:55 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:41.314 13:33:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:41.575 aio_disk 00:05:41.575 13:33:55 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:41.575 13:33:55 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:41.575 13:33:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:43.488 f462d37b-21b9-4a3b-8b72-590a32232886 
00:05:43.488 13:33:57 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:43.488 13:33:57 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:43.488 13:33:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:43.488 13:33:57 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:43.488 13:33:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:43.488 13:33:57 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:43.488 13:33:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:43.748 13:33:58 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:43.748 13:33:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:44.008 13:33:58 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:05:44.008 13:33:58 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:44.008 13:33:58 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:44.008 MallocForCryptoBdev 00:05:44.008 13:33:58 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:05:44.008 13:33:58 json_config -- json_config/json_config.sh@159 -- # wc -l 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:44.269 13:33:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:44.269 [2024-06-10 13:33:58.689671] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:44.269 CryptoMallocBdev 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@71 -- # sort 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@72 -- # sort 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.269 13:33:58 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:44.269 13:33:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:44.269 13:33:58 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.530 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\8\5\d\a\2\5\3\-\1\4\7\f\-\4\1\c\c\-\8\c\d\c\-\3\a\0\e\1\4\9\4\1\9\a\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\3\c\7\e\c\a\1\-\d\8\6\e\-\4\1\e\a\-\9\0\c\0\-\6\8\a\c\2\b\1\3\e\3\0\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\b\2\9\6\5\e\4\-\2\e\4\9\-\4\d\9\2\-\a\8\c\7\-\f\1\1\3\0\3\c\4\2\5\f\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\0\6\b\0\1\c\8\-\6\b\5\f\-\4\c\0\9\-\8\6\b\c\-\f\1\4\d\d\d\f\1\a\c\f\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@86 -- # cat 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:44.531 Expected events matched: 00:05:44.531 bdev_register:385da253-147f-41cc-8cdc-3a0e149419ab 00:05:44.531 bdev_register:63c7eca1-d86e-41ea-90c0-68ac2b13e30e 00:05:44.531 
bdev_register:6b2965e4-2e49-4d92-a8c7-f11303c425f5 00:05:44.531 bdev_register:906b01c8-6b5f-4c09-86bc-f14dddf1acfb 00:05:44.531 bdev_register:aio_disk 00:05:44.531 bdev_register:CryptoMallocBdev 00:05:44.531 bdev_register:Malloc0 00:05:44.531 bdev_register:Malloc0p0 00:05:44.531 bdev_register:Malloc0p1 00:05:44.531 bdev_register:Malloc0p2 00:05:44.531 bdev_register:Malloc1 00:05:44.531 bdev_register:Malloc3 00:05:44.531 bdev_register:MallocForCryptoBdev 00:05:44.531 bdev_register:Null0 00:05:44.531 bdev_register:Nvme0n1 00:05:44.531 bdev_register:Nvme0n1p0 00:05:44.531 bdev_register:Nvme0n1p1 00:05:44.531 bdev_register:PTBdevFromMalloc3 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:05:44.531 13:33:58 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:44.531 13:33:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:44.531 13:33:58 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:05:44.531 13:33:58 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:44.531 13:33:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:44.792 13:33:59 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:05:44.792 13:33:59 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:44.792 13:33:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:44.792 MallocBdevForConfigChangeCheck 00:05:44.792 13:33:59 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:05:44.792 13:33:59 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:44.792 13:33:59 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:44.792 13:33:59 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:05:44.792 13:33:59 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:45.365 13:33:59 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:05:45.365 INFO: shutting down applications... 00:05:45.365 13:33:59 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:05:45.365 13:33:59 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:05:45.365 13:33:59 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:05:45.365 13:33:59 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:05:45.365 [2024-06-10 13:33:59.768904] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:05:47.279 Calling clear_iscsi_subsystem 00:05:47.279 Calling clear_nvmf_subsystem 00:05:47.279 Calling clear_nbd_subsystem 00:05:47.279 Calling clear_ublk_subsystem 00:05:47.279 Calling clear_vhost_blk_subsystem 00:05:47.279 Calling clear_vhost_scsi_subsystem 00:05:47.279 Calling clear_bdev_subsystem 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@343 -- # count=100 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:05:47.540 13:34:01 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:05:47.803 13:34:02 json_config -- json_config/json_config.sh@345 -- # break 00:05:47.803 13:34:02 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:05:47.803 13:34:02 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:05:47.803 13:34:02 json_config -- json_config/common.sh@31 -- # local app=target 00:05:47.803 13:34:02 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:47.803 13:34:02 json_config -- json_config/common.sh@35 -- # [[ -n 1438576 ]] 00:05:47.803 13:34:02 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1438576 00:05:47.803 13:34:02 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:47.803 13:34:02 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:47.803 13:34:02 json_config -- json_config/common.sh@41 -- # kill -0 1438576 00:05:47.803 13:34:02 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:05:48.377 13:34:02 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:05:48.377 13:34:02 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:48.377 13:34:02 json_config -- json_config/common.sh@41 -- # kill -0 1438576 00:05:48.377 13:34:02 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:48.377 13:34:02 json_config -- json_config/common.sh@43 -- # break 00:05:48.377 13:34:02 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:48.377 13:34:02 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:48.377 SPDK target 
shutdown done 00:05:48.377 13:34:02 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:05:48.377 INFO: relaunching applications... 00:05:48.377 13:34:02 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:48.377 13:34:02 json_config -- json_config/common.sh@9 -- # local app=target 00:05:48.377 13:34:02 json_config -- json_config/common.sh@10 -- # shift 00:05:48.377 13:34:02 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:48.377 13:34:02 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:48.377 13:34:02 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:48.377 13:34:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:48.377 13:34:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:48.377 13:34:02 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1441364 00:05:48.377 13:34:02 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:48.377 Waiting for target to run... 00:05:48.377 13:34:02 json_config -- json_config/common.sh@25 -- # waitforlisten 1441364 /var/tmp/spdk_tgt.sock 00:05:48.377 13:34:02 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@830 -- # '[' -z 1441364 ']' 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:05:48.377 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:48.377 13:34:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:48.377 [2024-06-10 13:34:02.674953] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:05:48.377 [2024-06-10 13:34:02.675006] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1441364 ] 00:05:48.638 [2024-06-10 13:34:02.971484] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.638 [2024-06-10 13:34:03.025176] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.638 [2024-06-10 13:34:03.079041] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:05:48.638 [2024-06-10 13:34:03.087079] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:48.638 [2024-06-10 13:34:03.095093] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:48.899 [2024-06-10 13:34:03.175564] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:50.814 [2024-06-10 13:34:05.284863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:50.814 [2024-06-10 13:34:05.284907] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:05:50.814 [2024-06-10 13:34:05.284917] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:51.074 [2024-06-10 13:34:05.292874] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:51.074 
[2024-06-10 13:34:05.292894] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:51.074 [2024-06-10 13:34:05.300890] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:51.074 [2024-06-10 13:34:05.300907] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:51.074 [2024-06-10 13:34:05.308921] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:05:51.074 [2024-06-10 13:34:05.308939] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:05:51.074 [2024-06-10 13:34:05.308946] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:51.335 [2024-06-10 13:34:05.642368] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:51.335 [2024-06-10 13:34:05.642404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:51.335 [2024-06-10 13:34:05.642415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x154de10 00:05:51.335 [2024-06-10 13:34:05.642421] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:51.335 [2024-06-10 13:34:05.642669] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:51.335 [2024-06-10 13:34:05.642683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:51.335 13:34:05 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:51.335 13:34:05 json_config -- common/autotest_common.sh@863 -- # return 0 00:05:51.335 13:34:05 json_config -- json_config/common.sh@26 -- # echo '' 00:05:51.335 00:05:51.335 13:34:05 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:05:51.335 13:34:05 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 
00:05:51.335 INFO: Checking if target configuration is the same... 00:05:51.335 13:34:05 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:51.335 13:34:05 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:05:51.335 13:34:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:51.335 + '[' 2 -ne 2 ']' 00:05:51.335 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:51.335 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:05:51.335 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:51.335 +++ basename /dev/fd/62 00:05:51.335 ++ mktemp /tmp/62.XXX 00:05:51.335 + tmp_file_1=/tmp/62.I4d 00:05:51.335 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:51.335 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:51.335 + tmp_file_2=/tmp/spdk_tgt_config.json.aFr 00:05:51.335 + ret=0 00:05:51.335 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:51.907 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:51.907 + diff -u /tmp/62.I4d /tmp/spdk_tgt_config.json.aFr 00:05:51.907 + echo 'INFO: JSON config files are the same' 00:05:51.907 INFO: JSON config files are the same 00:05:51.907 + rm /tmp/62.I4d /tmp/spdk_tgt_config.json.aFr 00:05:51.907 + exit 0 00:05:51.907 13:34:06 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:05:51.907 13:34:06 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:05:51.907 INFO: changing configuration and checking if this can be detected... 
00:05:51.907 13:34:06 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:51.907 13:34:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:51.907 13:34:06 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:05:51.907 13:34:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:51.907 13:34:06 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:51.907 + '[' 2 -ne 2 ']' 00:05:51.907 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:51.907 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:05:51.907 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:51.907 +++ basename /dev/fd/62 00:05:51.907 ++ mktemp /tmp/62.XXX 00:05:51.907 + tmp_file_1=/tmp/62.m7a 00:05:51.907 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:51.907 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:51.907 + tmp_file_2=/tmp/spdk_tgt_config.json.TTJ 00:05:51.907 + ret=0 00:05:51.907 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:52.479 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:52.479 + diff -u /tmp/62.m7a /tmp/spdk_tgt_config.json.TTJ 00:05:52.479 + ret=1 00:05:52.479 + echo '=== Start of file: /tmp/62.m7a ===' 00:05:52.479 + cat /tmp/62.m7a 00:05:52.479 + echo '=== End of file: /tmp/62.m7a ===' 00:05:52.479 + echo '' 00:05:52.479 + echo '=== Start of file: /tmp/spdk_tgt_config.json.TTJ ===' 00:05:52.479 + cat /tmp/spdk_tgt_config.json.TTJ 00:05:52.479 + echo '=== End of file: /tmp/spdk_tgt_config.json.TTJ ===' 00:05:52.479 + echo '' 00:05:52.479 + rm /tmp/62.m7a /tmp/spdk_tgt_config.json.TTJ 00:05:52.479 + exit 1 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:05:52.479 INFO: configuration change detected. 
00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:05:52.479 13:34:06 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:52.479 13:34:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@317 -- # [[ -n 1441364 ]] 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:05:52.479 13:34:06 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:52.479 13:34:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:05:52.479 13:34:06 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:05:52.479 13:34:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:05:52.740 13:34:06 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:05:52.740 13:34:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:05:52.740 13:34:07 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:05:52.740 13:34:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:05:53.000 13:34:07 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:05:53.000 13:34:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@193 -- # uname -s 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:53.261 13:34:07 json_config -- json_config/json_config.sh@323 -- # killprocess 1441364 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@949 -- # '[' -z 1441364 ']' 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@953 -- # kill -0 1441364 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@954 -- # uname 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1441364 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1441364' 00:05:53.261 killing process with pid 1441364 00:05:53.261 13:34:07 json_config -- common/autotest_common.sh@968 -- # kill 1441364 00:05:53.261 13:34:07 json_config -- 
common/autotest_common.sh@973 -- # wait 1441364 00:05:55.810 13:34:09 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:55.810 13:34:09 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:05:55.810 13:34:09 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:55.810 13:34:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.810 13:34:09 json_config -- json_config/json_config.sh@328 -- # return 0 00:05:55.810 13:34:09 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:05:55.810 INFO: Success 00:05:55.810 00:05:55.810 real 0m20.337s 00:05:55.810 user 0m25.341s 00:05:55.810 sys 0m2.652s 00:05:55.810 13:34:09 json_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:55.810 13:34:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:55.810 ************************************ 00:05:55.810 END TEST json_config 00:05:55.810 ************************************ 00:05:55.810 13:34:09 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:55.810 13:34:09 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:55.810 13:34:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:55.810 13:34:09 -- common/autotest_common.sh@10 -- # set +x 00:05:55.810 ************************************ 00:05:55.810 START TEST json_config_extra_key 00:05:55.810 ************************************ 00:05:55.810 13:34:09 json_config_extra_key -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:55.810 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:55.810 13:34:09 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:55.811 13:34:09 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:05:55.811 13:34:09 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:55.811 13:34:09 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:55.811 13:34:09 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:55.811 13:34:09 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:55.811 13:34:09 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:55.811 13:34:09 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:05:55.811 13:34:09 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:55.811 13:34:09 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:55.811 INFO: launching applications... 00:05:55.811 13:34:09 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1442814 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:55.811 Waiting for target to run... 
00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1442814 /var/tmp/spdk_tgt.sock 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@830 -- # '[' -z 1442814 ']' 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:55.811 13:34:09 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:55.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:55.811 13:34:09 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:55.811 [2024-06-10 13:34:09.995788] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:55.811 [2024-06-10 13:34:09.995837] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1442814 ] 00:05:56.073 [2024-06-10 13:34:10.308399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.073 [2024-06-10 13:34:10.360197] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.645 13:34:10 json_config_extra_key -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:56.645 13:34:10 json_config_extra_key -- common/autotest_common.sh@863 -- # return 0 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:05:56.645 00:05:56.645 13:34:10 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:56.645 INFO: shutting down applications... 00:05:56.645 13:34:10 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1442814 ]] 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1442814 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1442814 00:05:56.645 13:34:10 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 
)) 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1442814 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@43 -- # break 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:56.906 13:34:11 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:56.906 SPDK target shutdown done 00:05:56.906 13:34:11 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:56.906 Success 00:05:56.906 00:05:56.906 real 0m1.521s 00:05:56.906 user 0m1.140s 00:05:56.906 sys 0m0.380s 00:05:56.906 13:34:11 json_config_extra_key -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:56.906 13:34:11 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:56.906 ************************************ 00:05:56.906 END TEST json_config_extra_key 00:05:56.906 ************************************ 00:05:57.168 13:34:11 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:57.168 13:34:11 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:57.168 13:34:11 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:57.168 13:34:11 -- common/autotest_common.sh@10 -- # set +x 00:05:57.168 ************************************ 00:05:57.168 START TEST alias_rpc 00:05:57.168 ************************************ 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:57.168 * Looking for test storage... 
00:05:57.168 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:05:57.168 13:34:11 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:57.168 13:34:11 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1443188 00:05:57.168 13:34:11 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1443188 00:05:57.168 13:34:11 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@830 -- # '[' -z 1443188 ']' 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:57.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:57.168 13:34:11 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:57.168 [2024-06-10 13:34:11.588368] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:57.168 [2024-06-10 13:34:11.588431] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1443188 ] 00:05:57.429 [2024-06-10 13:34:11.681349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.429 [2024-06-10 13:34:11.751722] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.000 13:34:12 alias_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:58.000 13:34:12 alias_rpc -- common/autotest_common.sh@863 -- # return 0 00:05:58.000 13:34:12 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:58.260 13:34:12 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1443188 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@949 -- # '[' -z 1443188 ']' 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@953 -- # kill -0 1443188 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@954 -- # uname 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1443188 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1443188' 00:05:58.260 killing process with pid 1443188 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@968 -- # kill 1443188 00:05:58.260 13:34:12 alias_rpc -- common/autotest_common.sh@973 -- # wait 1443188 00:05:58.521 00:05:58.521 real 0m1.509s 00:05:58.521 user 0m1.741s 00:05:58.521 sys 0m0.396s 00:05:58.521 13:34:12 alias_rpc -- common/autotest_common.sh@1125 -- # 
xtrace_disable 00:05:58.521 13:34:12 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:58.521 ************************************ 00:05:58.521 END TEST alias_rpc 00:05:58.521 ************************************ 00:05:58.521 13:34:12 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:05:58.521 13:34:12 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:58.521 13:34:12 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:58.521 13:34:12 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:58.521 13:34:12 -- common/autotest_common.sh@10 -- # set +x 00:05:58.781 ************************************ 00:05:58.781 START TEST spdkcli_tcp 00:05:58.781 ************************************ 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:58.781 * Looking for test storage... 00:05:58.781 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:58.781 13:34:13 spdkcli_tcp -- 
common/autotest_common.sh@10 -- # set +x 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1443577 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1443577 00:05:58.781 13:34:13 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@830 -- # '[' -z 1443577 ']' 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.781 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:58.781 13:34:13 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:58.781 [2024-06-10 13:34:13.187558] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:05:58.781 [2024-06-10 13:34:13.187625] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1443577 ] 00:05:59.042 [2024-06-10 13:34:13.280335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:59.042 [2024-06-10 13:34:13.351592] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:05:59.042 [2024-06-10 13:34:13.351598] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.614 13:34:14 spdkcli_tcp -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:59.614 13:34:14 spdkcli_tcp -- common/autotest_common.sh@863 -- # return 0 00:05:59.614 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1443900 00:05:59.614 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:59.614 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:59.874 [ 00:05:59.874 "bdev_malloc_delete", 00:05:59.874 "bdev_malloc_create", 00:05:59.874 "bdev_null_resize", 00:05:59.874 "bdev_null_delete", 00:05:59.874 "bdev_null_create", 00:05:59.874 "bdev_nvme_cuse_unregister", 00:05:59.874 "bdev_nvme_cuse_register", 00:05:59.874 "bdev_opal_new_user", 00:05:59.874 "bdev_opal_set_lock_state", 00:05:59.874 "bdev_opal_delete", 00:05:59.874 "bdev_opal_get_info", 00:05:59.874 "bdev_opal_create", 00:05:59.874 "bdev_nvme_opal_revert", 00:05:59.874 "bdev_nvme_opal_init", 00:05:59.874 "bdev_nvme_send_cmd", 00:05:59.874 "bdev_nvme_get_path_iostat", 00:05:59.874 "bdev_nvme_get_mdns_discovery_info", 00:05:59.874 "bdev_nvme_stop_mdns_discovery", 00:05:59.874 "bdev_nvme_start_mdns_discovery", 00:05:59.874 "bdev_nvme_set_multipath_policy", 00:05:59.874 "bdev_nvme_set_preferred_path", 00:05:59.874 "bdev_nvme_get_io_paths", 
00:05:59.874 "bdev_nvme_remove_error_injection", 00:05:59.874 "bdev_nvme_add_error_injection", 00:05:59.874 "bdev_nvme_get_discovery_info", 00:05:59.874 "bdev_nvme_stop_discovery", 00:05:59.874 "bdev_nvme_start_discovery", 00:05:59.874 "bdev_nvme_get_controller_health_info", 00:05:59.874 "bdev_nvme_disable_controller", 00:05:59.874 "bdev_nvme_enable_controller", 00:05:59.874 "bdev_nvme_reset_controller", 00:05:59.874 "bdev_nvme_get_transport_statistics", 00:05:59.874 "bdev_nvme_apply_firmware", 00:05:59.874 "bdev_nvme_detach_controller", 00:05:59.874 "bdev_nvme_get_controllers", 00:05:59.874 "bdev_nvme_attach_controller", 00:05:59.874 "bdev_nvme_set_hotplug", 00:05:59.874 "bdev_nvme_set_options", 00:05:59.874 "bdev_passthru_delete", 00:05:59.874 "bdev_passthru_create", 00:05:59.874 "bdev_lvol_set_parent_bdev", 00:05:59.874 "bdev_lvol_set_parent", 00:05:59.874 "bdev_lvol_check_shallow_copy", 00:05:59.874 "bdev_lvol_start_shallow_copy", 00:05:59.874 "bdev_lvol_grow_lvstore", 00:05:59.874 "bdev_lvol_get_lvols", 00:05:59.874 "bdev_lvol_get_lvstores", 00:05:59.874 "bdev_lvol_delete", 00:05:59.874 "bdev_lvol_set_read_only", 00:05:59.874 "bdev_lvol_resize", 00:05:59.874 "bdev_lvol_decouple_parent", 00:05:59.874 "bdev_lvol_inflate", 00:05:59.874 "bdev_lvol_rename", 00:05:59.875 "bdev_lvol_clone_bdev", 00:05:59.875 "bdev_lvol_clone", 00:05:59.875 "bdev_lvol_snapshot", 00:05:59.875 "bdev_lvol_create", 00:05:59.875 "bdev_lvol_delete_lvstore", 00:05:59.875 "bdev_lvol_rename_lvstore", 00:05:59.875 "bdev_lvol_create_lvstore", 00:05:59.875 "bdev_raid_set_options", 00:05:59.875 "bdev_raid_remove_base_bdev", 00:05:59.875 "bdev_raid_add_base_bdev", 00:05:59.875 "bdev_raid_delete", 00:05:59.875 "bdev_raid_create", 00:05:59.875 "bdev_raid_get_bdevs", 00:05:59.875 "bdev_error_inject_error", 00:05:59.875 "bdev_error_delete", 00:05:59.875 "bdev_error_create", 00:05:59.875 "bdev_split_delete", 00:05:59.875 "bdev_split_create", 00:05:59.875 "bdev_delay_delete", 00:05:59.875 
"bdev_delay_create", 00:05:59.875 "bdev_delay_update_latency", 00:05:59.875 "bdev_zone_block_delete", 00:05:59.875 "bdev_zone_block_create", 00:05:59.875 "blobfs_create", 00:05:59.875 "blobfs_detect", 00:05:59.875 "blobfs_set_cache_size", 00:05:59.875 "bdev_crypto_delete", 00:05:59.875 "bdev_crypto_create", 00:05:59.875 "bdev_compress_delete", 00:05:59.875 "bdev_compress_create", 00:05:59.875 "bdev_compress_get_orphans", 00:05:59.875 "bdev_aio_delete", 00:05:59.875 "bdev_aio_rescan", 00:05:59.875 "bdev_aio_create", 00:05:59.875 "bdev_ftl_set_property", 00:05:59.875 "bdev_ftl_get_properties", 00:05:59.875 "bdev_ftl_get_stats", 00:05:59.875 "bdev_ftl_unmap", 00:05:59.875 "bdev_ftl_unload", 00:05:59.875 "bdev_ftl_delete", 00:05:59.875 "bdev_ftl_load", 00:05:59.875 "bdev_ftl_create", 00:05:59.875 "bdev_virtio_attach_controller", 00:05:59.875 "bdev_virtio_scsi_get_devices", 00:05:59.875 "bdev_virtio_detach_controller", 00:05:59.875 "bdev_virtio_blk_set_hotplug", 00:05:59.875 "bdev_iscsi_delete", 00:05:59.875 "bdev_iscsi_create", 00:05:59.875 "bdev_iscsi_set_options", 00:05:59.875 "accel_error_inject_error", 00:05:59.875 "ioat_scan_accel_module", 00:05:59.875 "dsa_scan_accel_module", 00:05:59.875 "iaa_scan_accel_module", 00:05:59.875 "dpdk_cryptodev_get_driver", 00:05:59.875 "dpdk_cryptodev_set_driver", 00:05:59.875 "dpdk_cryptodev_scan_accel_module", 00:05:59.875 "compressdev_scan_accel_module", 00:05:59.875 "keyring_file_remove_key", 00:05:59.875 "keyring_file_add_key", 00:05:59.875 "keyring_linux_set_options", 00:05:59.875 "iscsi_get_histogram", 00:05:59.875 "iscsi_enable_histogram", 00:05:59.875 "iscsi_set_options", 00:05:59.875 "iscsi_get_auth_groups", 00:05:59.875 "iscsi_auth_group_remove_secret", 00:05:59.875 "iscsi_auth_group_add_secret", 00:05:59.875 "iscsi_delete_auth_group", 00:05:59.875 "iscsi_create_auth_group", 00:05:59.875 "iscsi_set_discovery_auth", 00:05:59.875 "iscsi_get_options", 00:05:59.875 "iscsi_target_node_request_logout", 00:05:59.875 
"iscsi_target_node_set_redirect", 00:05:59.875 "iscsi_target_node_set_auth", 00:05:59.875 "iscsi_target_node_add_lun", 00:05:59.875 "iscsi_get_stats", 00:05:59.875 "iscsi_get_connections", 00:05:59.875 "iscsi_portal_group_set_auth", 00:05:59.875 "iscsi_start_portal_group", 00:05:59.875 "iscsi_delete_portal_group", 00:05:59.875 "iscsi_create_portal_group", 00:05:59.875 "iscsi_get_portal_groups", 00:05:59.875 "iscsi_delete_target_node", 00:05:59.875 "iscsi_target_node_remove_pg_ig_maps", 00:05:59.875 "iscsi_target_node_add_pg_ig_maps", 00:05:59.875 "iscsi_create_target_node", 00:05:59.875 "iscsi_get_target_nodes", 00:05:59.875 "iscsi_delete_initiator_group", 00:05:59.875 "iscsi_initiator_group_remove_initiators", 00:05:59.875 "iscsi_initiator_group_add_initiators", 00:05:59.875 "iscsi_create_initiator_group", 00:05:59.875 "iscsi_get_initiator_groups", 00:05:59.875 "nvmf_set_crdt", 00:05:59.875 "nvmf_set_config", 00:05:59.875 "nvmf_set_max_subsystems", 00:05:59.875 "nvmf_stop_mdns_prr", 00:05:59.875 "nvmf_publish_mdns_prr", 00:05:59.875 "nvmf_subsystem_get_listeners", 00:05:59.875 "nvmf_subsystem_get_qpairs", 00:05:59.875 "nvmf_subsystem_get_controllers", 00:05:59.875 "nvmf_get_stats", 00:05:59.875 "nvmf_get_transports", 00:05:59.875 "nvmf_create_transport", 00:05:59.875 "nvmf_get_targets", 00:05:59.875 "nvmf_delete_target", 00:05:59.875 "nvmf_create_target", 00:05:59.875 "nvmf_subsystem_allow_any_host", 00:05:59.875 "nvmf_subsystem_remove_host", 00:05:59.875 "nvmf_subsystem_add_host", 00:05:59.875 "nvmf_ns_remove_host", 00:05:59.875 "nvmf_ns_add_host", 00:05:59.875 "nvmf_subsystem_remove_ns", 00:05:59.875 "nvmf_subsystem_add_ns", 00:05:59.875 "nvmf_subsystem_listener_set_ana_state", 00:05:59.875 "nvmf_discovery_get_referrals", 00:05:59.875 "nvmf_discovery_remove_referral", 00:05:59.875 "nvmf_discovery_add_referral", 00:05:59.875 "nvmf_subsystem_remove_listener", 00:05:59.875 "nvmf_subsystem_add_listener", 00:05:59.875 "nvmf_delete_subsystem", 00:05:59.875 
"nvmf_create_subsystem", 00:05:59.875 "nvmf_get_subsystems", 00:05:59.875 "env_dpdk_get_mem_stats", 00:05:59.875 "nbd_get_disks", 00:05:59.875 "nbd_stop_disk", 00:05:59.875 "nbd_start_disk", 00:05:59.875 "ublk_recover_disk", 00:05:59.875 "ublk_get_disks", 00:05:59.875 "ublk_stop_disk", 00:05:59.875 "ublk_start_disk", 00:05:59.875 "ublk_destroy_target", 00:05:59.875 "ublk_create_target", 00:05:59.875 "virtio_blk_create_transport", 00:05:59.875 "virtio_blk_get_transports", 00:05:59.875 "vhost_controller_set_coalescing", 00:05:59.875 "vhost_get_controllers", 00:05:59.875 "vhost_delete_controller", 00:05:59.875 "vhost_create_blk_controller", 00:05:59.875 "vhost_scsi_controller_remove_target", 00:05:59.875 "vhost_scsi_controller_add_target", 00:05:59.875 "vhost_start_scsi_controller", 00:05:59.875 "vhost_create_scsi_controller", 00:05:59.875 "thread_set_cpumask", 00:05:59.875 "framework_get_scheduler", 00:05:59.875 "framework_set_scheduler", 00:05:59.875 "framework_get_reactors", 00:05:59.875 "thread_get_io_channels", 00:05:59.875 "thread_get_pollers", 00:05:59.875 "thread_get_stats", 00:05:59.875 "framework_monitor_context_switch", 00:05:59.875 "spdk_kill_instance", 00:05:59.875 "log_enable_timestamps", 00:05:59.875 "log_get_flags", 00:05:59.875 "log_clear_flag", 00:05:59.875 "log_set_flag", 00:05:59.875 "log_get_level", 00:05:59.875 "log_set_level", 00:05:59.875 "log_get_print_level", 00:05:59.875 "log_set_print_level", 00:05:59.875 "framework_enable_cpumask_locks", 00:05:59.875 "framework_disable_cpumask_locks", 00:05:59.875 "framework_wait_init", 00:05:59.875 "framework_start_init", 00:05:59.875 "scsi_get_devices", 00:05:59.875 "bdev_get_histogram", 00:05:59.875 "bdev_enable_histogram", 00:05:59.875 "bdev_set_qos_limit", 00:05:59.875 "bdev_set_qd_sampling_period", 00:05:59.875 "bdev_get_bdevs", 00:05:59.875 "bdev_reset_iostat", 00:05:59.875 "bdev_get_iostat", 00:05:59.875 "bdev_examine", 00:05:59.875 "bdev_wait_for_examine", 00:05:59.875 "bdev_set_options", 
00:05:59.875 "notify_get_notifications", 00:05:59.875 "notify_get_types", 00:05:59.875 "accel_get_stats", 00:05:59.875 "accel_set_options", 00:05:59.875 "accel_set_driver", 00:05:59.875 "accel_crypto_key_destroy", 00:05:59.875 "accel_crypto_keys_get", 00:05:59.875 "accel_crypto_key_create", 00:05:59.875 "accel_assign_opc", 00:05:59.875 "accel_get_module_info", 00:05:59.875 "accel_get_opc_assignments", 00:05:59.876 "vmd_rescan", 00:05:59.876 "vmd_remove_device", 00:05:59.876 "vmd_enable", 00:05:59.876 "sock_get_default_impl", 00:05:59.876 "sock_set_default_impl", 00:05:59.876 "sock_impl_set_options", 00:05:59.876 "sock_impl_get_options", 00:05:59.876 "iobuf_get_stats", 00:05:59.876 "iobuf_set_options", 00:05:59.876 "framework_get_pci_devices", 00:05:59.876 "framework_get_config", 00:05:59.876 "framework_get_subsystems", 00:05:59.876 "trace_get_info", 00:05:59.876 "trace_get_tpoint_group_mask", 00:05:59.876 "trace_disable_tpoint_group", 00:05:59.876 "trace_enable_tpoint_group", 00:05:59.876 "trace_clear_tpoint_mask", 00:05:59.876 "trace_set_tpoint_mask", 00:05:59.876 "keyring_get_keys", 00:05:59.876 "spdk_get_version", 00:05:59.876 "rpc_get_methods" 00:05:59.876 ] 00:05:59.876 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:59.876 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:59.876 13:34:14 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1443577 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@949 -- # '[' -z 1443577 ']' 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@953 -- # kill -0 1443577 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@954 -- # uname 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:59.876 13:34:14 spdkcli_tcp -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1443577 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1443577' 00:05:59.876 killing process with pid 1443577 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@968 -- # kill 1443577 00:05:59.876 13:34:14 spdkcli_tcp -- common/autotest_common.sh@973 -- # wait 1443577 00:06:00.137 00:06:00.137 real 0m1.529s 00:06:00.137 user 0m2.866s 00:06:00.137 sys 0m0.453s 00:06:00.137 13:34:14 spdkcli_tcp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:00.137 13:34:14 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:00.137 ************************************ 00:06:00.137 END TEST spdkcli_tcp 00:06:00.137 ************************************ 00:06:00.137 13:34:14 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:00.137 13:34:14 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:00.137 13:34:14 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:00.137 13:34:14 -- common/autotest_common.sh@10 -- # set +x 00:06:00.137 ************************************ 00:06:00.137 START TEST dpdk_mem_utility 00:06:00.137 ************************************ 00:06:00.137 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:00.398 * Looking for test storage... 
00:06:00.398 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:00.398 13:34:14 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:00.398 13:34:14 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1443973 00:06:00.398 13:34:14 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1443973 00:06:00.398 13:34:14 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@830 -- # '[' -z 1443973 ']' 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:00.398 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:00.398 13:34:14 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:00.398 [2024-06-10 13:34:14.768591] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:00.398 [2024-06-10 13:34:14.768656] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1443973 ] 00:06:00.398 [2024-06-10 13:34:14.863692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.659 [2024-06-10 13:34:14.933101] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.230 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:01.230 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@863 -- # return 0 00:06:01.230 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:01.230 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:01.230 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:01.230 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:01.230 { 00:06:01.230 "filename": "/tmp/spdk_mem_dump.txt" 00:06:01.230 } 00:06:01.230 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:01.230 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:01.496 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:01.496 1 heaps totaling size 814.000000 MiB 00:06:01.496 size: 814.000000 MiB heap id: 0 00:06:01.496 end heaps---------- 00:06:01.496 8 mempools totaling size 598.116089 MiB 00:06:01.496 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:01.496 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:01.496 size: 84.521057 MiB name: bdev_io_1443973 00:06:01.496 size: 51.011292 MiB name: evtpool_1443973 00:06:01.496 size: 50.003479 MiB name: msgpool_1443973 00:06:01.496 size: 21.763794 MiB name: 
PDU_Pool 00:06:01.496 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:01.496 size: 0.026123 MiB name: Session_Pool 00:06:01.496 end mempools------- 00:06:01.496 201 memzones totaling size 4.176453 MiB 00:06:01.496 size: 1.000366 MiB name: RG_ring_0_1443973 00:06:01.496 size: 1.000366 MiB name: RG_ring_1_1443973 00:06:01.496 size: 1.000366 MiB name: RG_ring_4_1443973 00:06:01.496 size: 1.000366 MiB name: RG_ring_5_1443973 00:06:01.496 size: 0.125366 MiB name: RG_ring_2_1443973 00:06:01.496 size: 0.015991 MiB name: RG_ring_3_1443973 00:06:01.496 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.0_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.1_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.2_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.3_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.4_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.5_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.6_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:01.7_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.0_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.1_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.2_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.3_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.4_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.5_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.6_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4d:02.7_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.0_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.1_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.2_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.3_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.4_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.5_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.6_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:01.7_qat 00:06:01.496 size: 0.000305 MiB name: 0000:4f:02.0_qat 00:06:01.496 size: 0.000305 MiB 
name: 0000:4f:02.1_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.2_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.3_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.4_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.5_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.6_qat 00:06:01.497 size: 0.000305 MiB name: 0000:4f:02.7_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.0_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.1_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.2_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.3_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.4_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.5_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.6_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:01.7_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.0_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.1_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.2_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.3_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.4_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.5_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.6_qat 00:06:01.497 size: 0.000305 MiB name: 0000:51:02.7_qat 00:06:01.497 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:01.497 size: 0.000122 MiB name: 
rte_compressdev_data_3 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:01.497 size: 0.000122 MiB name: 
rte_compressdev_data_14 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:01.497 size: 0.000122 MiB 
name: rte_compressdev_data_25 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:01.497 size: 0.000122 
MiB name: rte_compressdev_data_36 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:01.497 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:01.497 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:01.498 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:01.498 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:01.498 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:01.498 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:01.498 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:01.498 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:01.498 size: 
0.000122 MiB name: rte_compressdev_data_47
00:06:01.498 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:06:01.498 end memzones-------
00:06:01.498 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0
00:06:01.498 heap id: 0 total size: 814.000000 MiB number of busy elements: 669 number of free elements: 14
00:06:01.498 list of free elements. size: 11.775574 MiB
00:06:01.498 element at address: 0x200000400000 with size: 1.999512 MiB
00:06:01.498 element at address: 0x200018e00000 with size: 0.999878 MiB
00:06:01.498 element at address: 0x200019000000 with size: 0.999878 MiB
00:06:01.498 element at address: 0x200003e00000 with size: 0.996460 MiB
00:06:01.498 element at address: 0x200031c00000 with size: 0.994446 MiB
00:06:01.498 element at address: 0x200013800000 with size: 0.978699 MiB
00:06:01.498 element at address: 0x200007000000 with size: 0.959839 MiB
00:06:01.498 element at address: 0x200019200000 with size: 0.936584 MiB
00:06:01.498 element at address: 0x20001aa00000 with size: 0.563660 MiB
00:06:01.498 element at address: 0x200003a00000 with size: 0.490173 MiB
00:06:01.498 element at address: 0x20000b200000 with size: 0.488892 MiB
00:06:01.498 element at address: 0x200000800000 with size: 0.486145 MiB
00:06:01.498 element at address: 0x200019400000 with size: 0.485657 MiB
00:06:01.498 element at address: 0x200027e00000 with size: 0.395752 MiB
00:06:01.498 list of standard malloc elements.
size: 199.904419 MiB 00:06:01.498 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:01.498 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:01.498 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:01.498 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:01.498 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:01.498 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:01.498 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:01.498 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:01.498 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000032f740 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000333200 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000033a780 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000033e240 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000341d00 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000349280 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000350800 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000357d80 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000035b840 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000035f300 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:01.498 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:01.498 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:01.498 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:06:01.498 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:01.498 element at address: 0x200000329b80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000032d640 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000331100 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000332180 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000335c40 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000338680 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000339700 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000033c140 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000340c80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000344740 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000347180 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000348200 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000034e700 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000034f780 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000353240 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000355c80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000356d00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000359740 with 
size: 0.004028 MiB 00:06:01.498 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000035d200 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000035e280 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000376d40 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:01.498 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:01.498 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:01.498 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:01.498 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:01.499 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:01.499 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:01.499 element at address: 0x200000200000 with size: 0.000244 MiB 00:06:01.499 element at address: 0x200000204300 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002245c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224680 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224740 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224800 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002248c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224980 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224a40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224b00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224bc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224c80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224d40 with size: 0.000183 
MiB 00:06:01.499 element at address: 0x200000224e00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224ec0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000224f80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225040 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225100 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002251c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225280 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225340 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225400 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002254c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225580 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225640 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225700 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225880 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225940 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225a00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225b80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225d80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225e40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225f00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000225fc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226080 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226140 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226200 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002262c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226380 
with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226440 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226500 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002265c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226680 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226740 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226800 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226980 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226a40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226b00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226c80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226d40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226e00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000226f80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000227040 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000227100 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000329300 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000329580 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000329640 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000329800 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000032d040 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000032d100 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:06:01.499 element at 
address: 0x200000330940 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000330b00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000330bc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000330d80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000334400 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003345c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000334680 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000334840 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000337ec0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000338080 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000338140 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000338300 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033b980 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033bb40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033bc00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033f440 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033f600 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033f6c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000033f880 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000342f00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003430c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000343180 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000343340 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003469c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000346b80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000346c40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000346e00 with size: 0.000183 MiB 
00:06:01.499 element at address: 0x20000034a480 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034a640 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034a700 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034a8c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034df40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034e100 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034e1c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000034e380 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000351a00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000351bc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000351c80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000351e40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003554c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000355680 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000355740 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000355900 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000358f80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000359140 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000359200 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003593c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000035ca40 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000035cc00 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000035ccc0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x20000035ce80 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000360500 with size: 0.000183 MiB 00:06:01.499 element at address: 0x2000003606c0 with size: 0.000183 MiB 00:06:01.499 element at address: 0x200000360780 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000360940 with 
size: 0.000183 MiB 00:06:01.500 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000364180 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000364240 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000364400 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000367a80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000367c40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000367d00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036b540 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036b700 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036b980 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036f000 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036f280 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000036f440 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000372c80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000372d40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000372f00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000376580 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000376740 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000376800 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037a040 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037a200 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:06:01.500 element at address: 
0x20000037a480 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037db00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000037df40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000381780 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000381840 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000381a00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000385080 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000385240 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000385300 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000388b40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000388d00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000388f80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000038c600 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000038c880 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000390280 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000390340 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000390500 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000393b80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000393d40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000393e00 with size: 0.000183 MiB 00:06:01.500 
element at address: 0x200000393fc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000397640 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000397800 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200000397a80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039b100 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039b380 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039b540 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000039f000 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003ad940 with size: 0.000183 
MiB 00:06:01.500 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c7480 
with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087c740 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087c800 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087c980 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:01.500 element at 
address: 0x20000087cec0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7d7c0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7d880 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7d940 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7da00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7dac0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7db80 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7dc40 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7dd00 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7ddc0 with size: 0.000183 MiB 00:06:01.500 element at address: 0x200003a7de80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7df40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e000 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e0c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e180 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e240 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e300 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e3c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e480 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e540 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e600 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e6c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e780 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e840 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e900 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7e9c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ea80 with size: 0.000183 MiB 
00:06:01.501 element at address: 0x200003a7eb40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ec00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ecc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ed80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ee40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7ef00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7efc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f080 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f140 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f200 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f2c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f380 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f440 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f500 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f5c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f680 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003a7f740 with size: 0.000183 MiB 00:06:01.501 element at address: 0x200003affa00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d880 with 
size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa904c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90580 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90640 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90700 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa907c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90880 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:06:01.501 element at address: 
0x20001aa91300 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:06:01.501 
element at address: 0x20001aa92800 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:06:01.501 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93c40 with size: 0.000183 
MiB 00:06:01.502 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa95140 
with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:01.502 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e65500 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:06:01.502 element at 
address: 0x200027e6d2c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e700 with size: 0.000183 MiB 
00:06:01.502 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fc00 with 
size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:01.502 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:01.502 list of memzone associated elements. size: 602.320007 MiB 00:06:01.502 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:01.502 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:01.502 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:01.502 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:01.502 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:01.502 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1443973_0 00:06:01.502 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:01.502 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1443973_0 00:06:01.502 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:01.502 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1443973_0 00:06:01.502 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:01.502 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:01.502 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:01.502 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:01.502 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:01.502 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1443973 00:06:01.502 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:01.502 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1443973 00:06:01.502 element at address: 0x2000002271c0 with size: 1.008118 MiB 00:06:01.502 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1443973 00:06:01.502 element at address: 
0x20000b2fde40 with size: 1.008118 MiB 00:06:01.502 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:01.502 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:01.502 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:01.503 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:01.503 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:01.503 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:01.503 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:01.503 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:01.503 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1443973 00:06:01.503 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:01.503 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1443973 00:06:01.503 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:01.503 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1443973 00:06:01.503 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:01.503 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1443973 00:06:01.503 element at address: 0x200003a7f800 with size: 0.500488 MiB 00:06:01.503 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1443973 00:06:01.503 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:01.503 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:01.503 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:01.503 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:01.503 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:01.503 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:01.503 element at address: 0x2000002043c0 with size: 0.125488 MiB 00:06:01.503 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1443973 00:06:01.503 element 
at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:01.503 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:01.503 element at address: 0x200027e65680 with size: 0.023743 MiB 00:06:01.503 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:01.503 element at address: 0x200000200100 with size: 0.016113 MiB 00:06:01.503 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1443973 00:06:01.503 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:06:01.503 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:01.503 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:06:01.503 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:01.503 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.0_qat 00:06:01.503 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.1_qat 00:06:01.503 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.2_qat 00:06:01.503 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.3_qat 00:06:01.503 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.4_qat 00:06:01.503 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.5_qat 00:06:01.503 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.6_qat 00:06:01.503 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:01.7_qat 00:06:01.503 element at 
address: 0x2000003b8d00 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.0_qat 00:06:01.503 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.1_qat 00:06:01.503 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.2_qat 00:06:01.503 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.3_qat 00:06:01.503 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.4_qat 00:06:01.503 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.5_qat 00:06:01.503 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.6_qat 00:06:01.503 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4d:02.7_qat 00:06:01.503 element at address: 0x20000039b700 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.0_qat 00:06:01.503 element at address: 0x200000397c40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.1_qat 00:06:01.503 element at address: 0x200000394180 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.2_qat 00:06:01.503 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.3_qat 00:06:01.503 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.4_qat 00:06:01.503 element at address: 0x200000389140 
with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.5_qat 00:06:01.503 element at address: 0x200000385680 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.6_qat 00:06:01.503 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:01.7_qat 00:06:01.503 element at address: 0x20000037e100 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.0_qat 00:06:01.503 element at address: 0x20000037a640 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.1_qat 00:06:01.503 element at address: 0x200000376b80 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.2_qat 00:06:01.503 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.3_qat 00:06:01.503 element at address: 0x20000036f600 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.4_qat 00:06:01.503 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.5_qat 00:06:01.503 element at address: 0x200000368080 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.6_qat 00:06:01.503 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:4f:02.7_qat 00:06:01.503 element at address: 0x200000360b00 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.0_qat 00:06:01.503 element at address: 0x20000035d040 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.1_qat 00:06:01.503 element at address: 0x200000359580 with size: 0.000427 MiB 
00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.2_qat 00:06:01.503 element at address: 0x200000355ac0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.3_qat 00:06:01.503 element at address: 0x200000352000 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.4_qat 00:06:01.503 element at address: 0x20000034e540 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.5_qat 00:06:01.503 element at address: 0x20000034aa80 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.6_qat 00:06:01.503 element at address: 0x200000346fc0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:01.7_qat 00:06:01.503 element at address: 0x200000343500 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.0_qat 00:06:01.503 element at address: 0x20000033fa40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.1_qat 00:06:01.503 element at address: 0x20000033bf80 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.2_qat 00:06:01.503 element at address: 0x2000003384c0 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.3_qat 00:06:01.503 element at address: 0x200000334a00 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.4_qat 00:06:01.503 element at address: 0x200000330f40 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.5_qat 00:06:01.503 element at address: 0x20000032d480 with size: 0.000427 MiB 00:06:01.503 associated memzone info: size: 0.000305 MiB name: 0000:51:02.6_qat 00:06:01.503 element at address: 0x2000003299c0 with size: 0.000427 MiB 00:06:01.503 associated 
memzone info: size: 0.000305 MiB name: 0000:51:02.7_qat 00:06:01.503 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:01.503 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:01.503 element at address: 0x200000225c40 with size: 0.000305 MiB 00:06:01.503 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1443973 00:06:01.503 element at address: 0x200003affac0 with size: 0.000305 MiB 00:06:01.503 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1443973 00:06:01.503 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:06:01.503 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:01.503 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:01.503 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:01.503 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:01.503 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:01.503 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:01.503 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:01.503 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:01.503 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:01.503 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:01.504 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:01.504 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:01.504 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:01.504 element at address: 0x2000003ce840 with size: 0.000244 MiB 
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:06:01.504 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:06:01.504 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:06:01.504 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:06:01.504 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:06:01.504 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:06:01.504 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:06:01.504 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:06:01.504 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:06:01.504 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:06:01.504 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:06:01.504 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:06:01.504 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:06:01.504 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:06:01.504 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:06:01.504 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:06:01.504 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:06:01.504 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:06:01.504 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:06:01.504 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:06:01.504 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:06:01.504 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:06:01.504 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:06:01.504 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:06:01.504 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:06:01.504 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:06:01.504 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:06:01.504 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:06:01.504 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:06:01.504 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:06:01.504 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:06:01.504 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:06:01.504 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:06:01.504 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:06:01.504 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:06:01.504 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:06:01.504 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:06:01.504 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:06:01.504 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:06:01.504 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:06:01.504 element at address: 0x20000039b600 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:06:01.504 element at address: 0x20000039b440 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:06:01.504 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:06:01.504 element at address: 0x200000397b40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:06:01.504 element at address: 0x200000397980 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:06:01.504 element at address: 0x200000397700 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:06:01.504 element at address: 0x200000394080 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:06:01.504 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:06:01.504 element at address: 0x200000393c40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:06:01.504 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:06:01.504 element at address: 0x200000390400 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:06:01.504 element at address: 0x200000390180 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:06:01.504 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:06:01.504 element at address: 0x20000038c940 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:06:01.504 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:06:01.504 element at address: 0x200000389040 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:06:01.504 element at address: 0x200000388e80 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:06:01.504 element at address: 0x200000388c00 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:06:01.504 element at address: 0x200000385580 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:06:01.504 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:06:01.504 element at address: 0x200000385140 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:06:01.504 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:06:01.504 element at address: 0x200000381900 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:06:01.504 element at address: 0x200000381680 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:06:01.504 element at address: 0x20000037e000 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:06:01.504 element at address: 0x20000037de40 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:06:01.504 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:06:01.504 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:06:01.504 element at address: 0x20000037a540 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:06:01.505 element at address: 0x20000037a380 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:06:01.505 element at address: 0x20000037a100 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:06:01.505 element at address: 0x200000376a80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:06:01.505 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:06:01.505 element at address: 0x200000376640 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:06:01.505 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:06:01.505 element at address: 0x200000372e00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:06:01.505 element at address: 0x200000372b80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:06:01.505 element at address: 0x20000036f500 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:06:01.505 element at address: 0x20000036f340 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:06:01.505 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:06:01.505 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:06:01.505 element at address: 0x20000036b880 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:06:01.505 element at address: 0x20000036b600 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:06:01.505 element at address: 0x200000367f80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:06:01.505 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:06:01.505 element at address: 0x200000367b40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:06:01.505 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:06:01.505 element at address: 0x200000364300 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:06:01.505 element at address: 0x200000364080 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:06:01.505 element at address: 0x200000360a00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64
00:06:01.505 element at address: 0x200000360840 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65
00:06:01.505 element at address: 0x2000003605c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32
00:06:01.505 element at address: 0x20000035cf40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66
00:06:01.505 element at address: 0x20000035cd80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67
00:06:01.505 element at address: 0x20000035cb00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33
00:06:01.505 element at address: 0x200000359480 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68
00:06:01.505 element at address: 0x2000003592c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69
00:06:01.505 element at address: 0x200000359040 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34
00:06:01.505 element at address: 0x2000003559c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70
00:06:01.505 element at address: 0x200000355800 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71
00:06:01.505 element at address: 0x200000355580 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35
00:06:01.505 element at address: 0x200000351f00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72
00:06:01.505 element at address: 0x200000351d40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73
00:06:01.505 element at address: 0x200000351ac0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36
00:06:01.505 element at address: 0x20000034e440 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74
00:06:01.505 element at address: 0x20000034e280 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75
00:06:01.505 element at address: 0x20000034e000 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37
00:06:01.505 element at address: 0x20000034a980 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76
00:06:01.505 element at address: 0x20000034a7c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77
00:06:01.505 element at address: 0x20000034a540 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38
00:06:01.505 element at address: 0x200000346ec0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78
00:06:01.505 element at address: 0x200000346d00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79
00:06:01.505 element at address: 0x200000346a80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39
00:06:01.505 element at address: 0x200000343400 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80
00:06:01.505 element at address: 0x200000343240 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81
00:06:01.505 element at address: 0x200000342fc0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40
00:06:01.505 element at address: 0x20000033f940 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82
00:06:01.505 element at address: 0x20000033f780 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83
00:06:01.505 element at address: 0x20000033f500 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41
00:06:01.505 element at address: 0x20000033be80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84
00:06:01.505 element at address: 0x20000033bcc0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85
00:06:01.505 element at address: 0x20000033ba40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42
00:06:01.505 element at address: 0x2000003383c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86
00:06:01.505 element at address: 0x200000338200 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87
00:06:01.505 element at address: 0x200000337f80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43
00:06:01.505 element at address: 0x200000334900 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88
00:06:01.505 element at address: 0x200000334740 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89
00:06:01.505 element at address: 0x2000003344c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44
00:06:01.505 element at address: 0x200000330e40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90
00:06:01.505 element at address: 0x200000330c80 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91
00:06:01.505 element at address: 0x200000330a00 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45
00:06:01.505 element at address: 0x20000032d380 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92
00:06:01.505 element at address: 0x20000032d1c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93
00:06:01.505 element at address: 0x20000032cf40 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46
00:06:01.505 element at address: 0x2000003298c0 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94
00:06:01.505 element at address: 0x200000329700 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95
00:06:01.505 element at address: 0x200000329480 with size: 0.000244 MiB
00:06:01.505 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47
00:06:01.505 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:06:01.505 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:06:01.505 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:06:01.505 13:34:15 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1443973
00:06:01.505 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@949 -- # '[' -z 1443973 ']'
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@953 -- # kill -0 1443973
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@954 -- # uname
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1443973
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1443973'
killing process with pid 1443973
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@968 -- # kill 1443973
00:06:01.506 13:34:15 dpdk_mem_utility -- common/autotest_common.sh@973 -- # wait 1443973
00:06:01.766
00:06:01.766 real 0m1.489s
00:06:01.766 user 0m1.700s
00:06:01.766 sys 0m0.417s
00:06:01.766 13:34:16 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:01.766 13:34:16 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:01.766 ************************************
00:06:01.766 END TEST dpdk_mem_utility
00:06:01.766 ************************************
00:06:01.766 13:34:16 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:01.766 13:34:16 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:01.766 13:34:16 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:01.766 13:34:16 -- common/autotest_common.sh@10 -- # set +x
00:06:01.766 ************************************
00:06:01.766 START TEST event
00:06:01.766 ************************************
00:06:01.766 13:34:16 event -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:02.027 * Looking for test storage...
00:06:02.027 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:06:02.027 13:34:16 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:06:02.027 13:34:16 event -- bdev/nbd_common.sh@6 -- # set -e
00:06:02.027 13:34:16 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:02.027 13:34:16 event -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']'
00:06:02.027 13:34:16 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:02.027 13:34:16 event -- common/autotest_common.sh@10 -- # set +x
00:06:02.027 ************************************
00:06:02.027 START TEST event_perf
00:06:02.027 ************************************
00:06:02.027 13:34:16 event.event_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:02.027 Running I/O for 1 seconds...[2024-06-10 13:34:16.319527] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:02.027 [2024-06-10 13:34:16.319612] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1444361 ]
00:06:02.027 [2024-06-10 13:34:16.413242] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:02.027 [2024-06-10 13:34:16.484969] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1
00:06:02.027 [2024-06-10 13:34:16.485110] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2
00:06:02.027 [2024-06-10 13:34:16.485270] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3
00:06:02.027 [2024-06-10 13:34:16.485422] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:03.411 Running I/O for 1 seconds...
00:06:03.411 lcore 0: 165911
00:06:03.411 lcore 1: 165910
00:06:03.411 lcore 2: 165908
00:06:03.411 lcore 3: 165911
00:06:03.411 done.
00:06:03.411
00:06:03.411 real 0m1.246s
00:06:03.411 user 0m4.146s
00:06:03.411 sys 0m0.098s
00:06:03.411 13:34:17 event.event_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:03.411 13:34:17 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:06:03.411 ************************************
00:06:03.411 END TEST event_perf
00:06:03.411 ************************************
00:06:03.411 13:34:17 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:03.411 13:34:17 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:06:03.411 13:34:17 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:03.411 13:34:17 event -- common/autotest_common.sh@10 -- # set +x
00:06:03.411 ************************************
00:06:03.411 START TEST event_reactor
00:06:03.411 ************************************
00:06:03.411 13:34:17 event.event_reactor -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:03.411 [2024-06-10 13:34:17.639804] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:03.411 [2024-06-10 13:34:17.639864] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1444706 ]
00:06:03.411 [2024-06-10 13:34:17.730351] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:03.411 [2024-06-10 13:34:17.795763] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:04.795 test_start
00:06:04.795 oneshot
00:06:04.795 tick 100
00:06:04.795 tick 100
00:06:04.795 tick 250
00:06:04.795 tick 100
00:06:04.795 tick 100
00:06:04.795 tick 100
00:06:04.795 tick 250
00:06:04.795 tick 500
00:06:04.795 tick 100
00:06:04.795 tick 100
00:06:04.795 tick 250
00:06:04.795 tick 100
00:06:04.795 tick 100
00:06:04.795 test_end
00:06:04.795
00:06:04.795 real 0m1.234s
00:06:04.795 user 0m1.134s
00:06:04.795 sys 0m0.096s
00:06:04.795 13:34:18 event.event_reactor -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:04.795 13:34:18 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:06:04.795 ************************************
00:06:04.795 END TEST event_reactor
00:06:04.795 ************************************
00:06:04.795 13:34:18 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:06:04.795 13:34:18 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:06:04.796 13:34:18 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:04.796 13:34:18 event -- common/autotest_common.sh@10 -- # set +x
00:06:04.796 ************************************
00:06:04.796 START TEST event_reactor_perf
00:06:04.796 ************************************
00:06:04.796 13:34:18 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
[2024-06-10 13:34:18.950189] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:04.796 [2024-06-10 13:34:18.950257] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1445026 ]
00:06:04.796 [2024-06-10 13:34:19.042391] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:04.796 [2024-06-10 13:34:19.117107] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:05.736 test_start
00:06:05.736 test_end
00:06:05.736 Performance: 364289 events per second
00:06:05.736
00:06:05.736 real 0m1.248s
00:06:05.736 user 0m1.143s
00:06:05.736 sys 0m0.101s
00:06:05.736 13:34:20 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:05.736 13:34:20 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:06:05.736 ************************************
00:06:05.736 END TEST event_reactor_perf
00:06:05.736 ************************************
00:06:05.996 13:34:20 event -- event/event.sh@49 -- # uname -s
00:06:05.996 13:34:20 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:06:05.996 13:34:20 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:06:05.997 13:34:20 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:05.997 13:34:20 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:05.997 13:34:20 event -- common/autotest_common.sh@10 -- # set +x
00:06:05.997 ************************************
00:06:05.997 START TEST event_scheduler
00:06:05.997 ************************************
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:06:05.997 * Looking for test storage...
00:06:05.997 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:06:05.997 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:06:05.997 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1445282
00:06:05.997 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:06:05.997 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:06:05.997 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1445282
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@830 -- # '[' -z 1445282 ']'
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@835 -- # local max_retries=100
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@839 -- # xtrace_disable
00:06:05.997 13:34:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:05.997 [2024-06-10 13:34:20.409874] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:05.997 [2024-06-10 13:34:20.409922] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1445282 ]
00:06:06.258 [2024-06-10 13:34:20.470612] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:06.258 [2024-06-10 13:34:20.528313] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:06.258 [2024-06-10 13:34:20.528432] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1
00:06:06.258 [2024-06-10 13:34:20.528584] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2
00:06:06.258 [2024-06-10 13:34:20.528586] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@863 -- # return 0
00:06:06.258 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:06.258 POWER: Env isn't set yet!
00:06:06.258 POWER: Attempting to initialise ACPI cpufreq power management...
00:06:06.258 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:06:06.258 POWER: Cannot set governor of lcore 0 to userspace
00:06:06.258 POWER: Attempting to initialise PSTAT power management...
00:06:06.258 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:06:06.258 POWER: Initialized successfully for lcore 0 power management
00:06:06.258 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:06:06.258 POWER: Initialized successfully for lcore 1 power management
00:06:06.258 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:06:06.258 POWER: Initialized successfully for lcore 2 power management
00:06:06.258 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:06:06.258 POWER: Initialized successfully for lcore 3 power management
00:06:06.258 [2024-06-10 13:34:20.619524] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:06:06.258 [2024-06-10 13:34:20.619543] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:06:06.258 [2024-06-10 13:34:20.619549] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:06:06.258 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:06.258 [2024-06-10 13:34:20.678986] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:06:06.258 13:34:20 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:06.258 13:34:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:06:06.258 ************************************
00:06:06.258 START TEST scheduler_create_thread
00:06:06.258 ************************************
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # scheduler_create_thread
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:06.258 2
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable
00:06:06.258 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:06:06.519 3
00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread --
scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.519 4 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.519 5 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.519 6 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@10 -- # set +x 00:06:06.519 7 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.519 8 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.519 9 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.519 13:34:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.780 10 00:06:06.780 13:34:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:07.041 13:34:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
half_active -a 0 00:06:07.041 13:34:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:07.041 13:34:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:08.491 13:34:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:08.491 13:34:22 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:08.491 13:34:22 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:08.491 13:34:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:08.491 13:34:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.061 13:34:23 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:09.061 13:34:23 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:09.061 13:34:23 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:09.061 13:34:23 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:09.999 13:34:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:09.999 13:34:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:09.999 13:34:24 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:09.999 13:34:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:09.999 13:34:24 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:10.568 13:34:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:10.568 00:06:10.568 real 0m4.223s 00:06:10.568 user 0m0.022s 00:06:10.568 sys 0m0.009s 00:06:10.568 13:34:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:10.568 13:34:24 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:10.568 ************************************ 00:06:10.569 END TEST scheduler_create_thread 00:06:10.569 ************************************ 00:06:10.569 13:34:24 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:10.569 13:34:24 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1445282 00:06:10.569 13:34:24 event.event_scheduler -- common/autotest_common.sh@949 -- # '[' -z 1445282 ']' 00:06:10.569 13:34:24 event.event_scheduler -- common/autotest_common.sh@953 -- # kill -0 1445282 00:06:10.569 13:34:24 event.event_scheduler -- common/autotest_common.sh@954 -- # uname 00:06:10.569 13:34:24 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:10.569 13:34:24 event.event_scheduler -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1445282 00:06:10.569 13:34:25 event.event_scheduler -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:06:10.569 13:34:25 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:06:10.569 13:34:25 event.event_scheduler -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1445282' 00:06:10.569 killing process with pid 1445282 00:06:10.569 13:34:25 event.event_scheduler -- common/autotest_common.sh@968 -- # kill 1445282 00:06:10.569 13:34:25 event.event_scheduler -- common/autotest_common.sh@973 -- # wait 1445282 00:06:10.828 [2024-06-10 
13:34:25.216180] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:11.089 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:11.089 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:11.089 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:11.089 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:11.089 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:11.089 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:11.089 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:11.089 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:11.089 00:06:11.089 real 0m5.131s 00:06:11.089 user 0m10.290s 00:06:11.089 sys 0m0.320s 00:06:11.089 13:34:25 event.event_scheduler -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:11.089 13:34:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:11.089 ************************************ 00:06:11.089 END TEST event_scheduler 00:06:11.089 ************************************ 00:06:11.089 13:34:25 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:11.089 13:34:25 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:11.089 13:34:25 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:11.089 13:34:25 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:11.089 13:34:25 event -- common/autotest_common.sh@10 -- # set +x 00:06:11.089 ************************************ 00:06:11.089 START TEST app_repeat 00:06:11.089 ************************************ 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@1124 -- # app_repeat_test 
00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1446385 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1446385' 00:06:11.089 Process app_repeat pid: 1446385 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:11.089 spdk_app_start Round 0 00:06:11.089 13:34:25 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1446385 /var/tmp/spdk-nbd.sock 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 1446385 ']' 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:11.089 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:11.089 13:34:25 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:11.089 [2024-06-10 13:34:25.518158] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:06:11.089 [2024-06-10 13:34:25.518254] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1446385 ] 00:06:11.349 [2024-06-10 13:34:25.619677] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:11.349 [2024-06-10 13:34:25.691392] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.349 [2024-06-10 13:34:25.691557] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.921 13:34:26 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:11.921 13:34:26 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:11.921 13:34:26 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:12.180 Malloc0 00:06:12.181 13:34:26 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:12.441 Malloc1 00:06:12.441 13:34:26 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@91 -- # 
local bdev_list 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.441 13:34:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:12.441 /dev/nbd0 00:06:12.701 13:34:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:12.701 13:34:26 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:12.701 13:34:26 event.app_repeat -- 
common/autotest_common.sh@872 -- # break 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.701 1+0 records in 00:06:12.701 1+0 records out 00:06:12.701 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281718 s, 14.5 MB/s 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:12.701 13:34:26 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:12.701 13:34:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.701 13:34:26 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.702 13:34:26 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:12.702 /dev/nbd1 00:06:12.702 13:34:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:12.702 13:34:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:12.702 13:34:27 event.app_repeat -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.702 1+0 records in 00:06:12.702 1+0 records out 00:06:12.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294033 s, 13.9 MB/s 00:06:12.702 13:34:27 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.963 13:34:27 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:12.963 13:34:27 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.963 13:34:27 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:12.963 13:34:27 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:12.963 { 00:06:12.963 "nbd_device": 
"/dev/nbd0", 00:06:12.963 "bdev_name": "Malloc0" 00:06:12.963 }, 00:06:12.963 { 00:06:12.963 "nbd_device": "/dev/nbd1", 00:06:12.963 "bdev_name": "Malloc1" 00:06:12.963 } 00:06:12.963 ]' 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:12.963 { 00:06:12.963 "nbd_device": "/dev/nbd0", 00:06:12.963 "bdev_name": "Malloc0" 00:06:12.963 }, 00:06:12.963 { 00:06:12.963 "nbd_device": "/dev/nbd1", 00:06:12.963 "bdev_name": "Malloc1" 00:06:12.963 } 00:06:12.963 ]' 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:12.963 13:34:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:12.963 /dev/nbd1' 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:13.223 /dev/nbd1' 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.223 13:34:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:13.224 256+0 records in 00:06:13.224 256+0 records out 00:06:13.224 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0120983 s, 86.7 MB/s 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:13.224 256+0 records in 00:06:13.224 256+0 records out 00:06:13.224 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.017373 s, 60.4 MB/s 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:13.224 256+0 records in 00:06:13.224 256+0 records out 00:06:13.224 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0169657 s, 61.8 MB/s 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:13.224 13:34:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:13.484 13:34:27 event.app_repeat -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:13.484 13:34:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.743 13:34:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:13.743 13:34:28 
event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:13.743 13:34:28 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:13.743 13:34:28 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:14.003 13:34:28 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:14.263 [2024-06-10 13:34:28.552731] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:14.263 [2024-06-10 13:34:28.616185] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.263 [2024-06-10 13:34:28.616224] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.264 [2024-06-10 13:34:28.648688] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:14.264 [2024-06-10 13:34:28.648718] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:17.561 13:34:31 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:17.562 13:34:31 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:17.562 spdk_app_start Round 1 00:06:17.562 13:34:31 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1446385 /var/tmp/spdk-nbd.sock 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 1446385 ']' 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:17.562 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:17.562 13:34:31 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:17.562 13:34:31 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:17.562 Malloc0 00:06:17.562 13:34:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:17.562 Malloc1 00:06:17.562 13:34:32 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:17.562 13:34:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:17.822 /dev/nbd0 00:06:17.822 13:34:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:17.822 13:34:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:17.822 1+0 records in 00:06:17.822 1+0 records out 00:06:17.822 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221756 s, 18.5 MB/s 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:17.822 13:34:32 event.app_repeat 
-- common/autotest_common.sh@885 -- # size=4096 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:17.822 13:34:32 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:17.822 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:17.822 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:17.822 13:34:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:18.082 /dev/nbd1 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:18.082 1+0 records in 00:06:18.082 1+0 records out 00:06:18.082 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000357184 s, 11.5 MB/s 00:06:18.082 
13:34:32 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:18.082 13:34:32 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.082 13:34:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:18.341 13:34:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:18.341 { 00:06:18.341 "nbd_device": "/dev/nbd0", 00:06:18.341 "bdev_name": "Malloc0" 00:06:18.342 }, 00:06:18.342 { 00:06:18.342 "nbd_device": "/dev/nbd1", 00:06:18.342 "bdev_name": "Malloc1" 00:06:18.342 } 00:06:18.342 ]' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:18.342 { 00:06:18.342 "nbd_device": "/dev/nbd0", 00:06:18.342 "bdev_name": "Malloc0" 00:06:18.342 }, 00:06:18.342 { 00:06:18.342 "nbd_device": "/dev/nbd1", 00:06:18.342 "bdev_name": "Malloc1" 00:06:18.342 } 00:06:18.342 ]' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:18.342 /dev/nbd1' 00:06:18.342 13:34:32 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:18.342 /dev/nbd1' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:18.342 256+0 records in 00:06:18.342 256+0 records out 00:06:18.342 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011684 s, 89.7 MB/s 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:18.342 256+0 records in 00:06:18.342 256+0 records out 00:06:18.342 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.016011 s, 65.5 MB/s 00:06:18.342 13:34:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:18.342 13:34:32 event.app_repeat -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:18.602 256+0 records in 00:06:18.602 256+0 records out 00:06:18.602 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.039332 s, 26.7 MB/s 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:18.602 13:34:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:18.602 13:34:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:18.862 13:34:33 
event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.862 13:34:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:19.122 13:34:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:19.123 13:34:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:19.123 13:34:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:19.383 13:34:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:19.643 [2024-06-10 13:34:33.914098] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:19.643 [2024-06-10 13:34:33.977748] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:06:19.643 [2024-06-10 
13:34:33.977753] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.643 [2024-06-10 13:34:34.011011] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:19.643 [2024-06-10 13:34:34.011043] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:22.938 13:34:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:22.938 13:34:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:22.938 spdk_app_start Round 2 00:06:22.938 13:34:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1446385 /var/tmp/spdk-nbd.sock 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 1446385 ']' 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:22.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
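The `nbd_dd_data_verify` phases traced in the round above first fill every nbd device from one random temp file (`write`), then compare each device back against that file (`verify`). A simplified sketch, with an illustrative temp path and without the `oflag=direct` flag the real helper passes (omitted so the sketch also works on regular files):

```shell
#!/usr/bin/env bash
# Sketch of the nbd_dd_data_verify pattern from bdev/nbd_common.sh:
# write 1 MiB of random data to each listed device, then cmp it back.
# Paths are illustrative; the real helper uses oflag=direct on the writes.
nbd_dd_data_verify() {
    local nbd_list=($1)               # e.g. "/dev/nbd0 /dev/nbd1"
    local operation=$2                # write | verify
    local tmp_file=/tmp/nbdrandtest.$$
    local i
    if [ "$operation" = write ]; then
        # one shared random source, 256 x 4096 B = 1 MiB
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 status=none
        for i in "${nbd_list[@]}"; do
            dd if="$tmp_file" of="$i" bs=4096 count=256 status=none
        done
    elif [ "$operation" = verify ]; then
        for i in "${nbd_list[@]}"; do
            # -b prints differing bytes, -n 1M limits the comparison
            cmp -b -n 1M "$tmp_file" "$i" || return 1
        done
        rm "$tmp_file"
    fi
}
```

Running `write` followed by `verify` on the same device list reproduces the two `dd`/`cmp` blocks visible in the trace.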
00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:22.938 13:34:36 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:22.938 13:34:36 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:22.938 Malloc0 00:06:22.938 13:34:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:22.938 Malloc1 00:06:22.938 13:34:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:22.938 13:34:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:23.200 /dev/nbd0 00:06:23.200 13:34:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:23.200 13:34:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:23.200 1+0 records in 00:06:23.200 1+0 records out 00:06:23.200 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250554 s, 16.3 MB/s 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:23.200 13:34:37 event.app_repeat -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:23.200 13:34:37 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:23.200 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:23.200 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:23.200 13:34:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:23.461 /dev/nbd1 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:23.461 1+0 records in 00:06:23.461 1+0 records out 00:06:23.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220358 s, 18.6 MB/s 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:23.461 13:34:37 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:23.461 13:34:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:23.722 { 00:06:23.722 "nbd_device": "/dev/nbd0", 00:06:23.722 "bdev_name": "Malloc0" 00:06:23.722 }, 00:06:23.722 { 00:06:23.722 "nbd_device": "/dev/nbd1", 00:06:23.722 "bdev_name": "Malloc1" 00:06:23.722 } 00:06:23.722 ]' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:23.722 { 00:06:23.722 "nbd_device": "/dev/nbd0", 00:06:23.722 "bdev_name": "Malloc0" 00:06:23.722 }, 00:06:23.722 { 00:06:23.722 "nbd_device": "/dev/nbd1", 00:06:23.722 "bdev_name": "Malloc1" 00:06:23.722 } 00:06:23.722 ]' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:23.722 /dev/nbd1' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:23.722 /dev/nbd1' 00:06:23.722 
13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:23.722 256+0 records in 00:06:23.722 256+0 records out 00:06:23.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0122345 s, 85.7 MB/s 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:23.722 13:34:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:23.983 256+0 records in 00:06:23.983 256+0 records out 00:06:23.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0356636 s, 29.4 MB/s 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:23.983 256+0 records in 00:06:23.983 256+0 records out 00:06:23.983 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0173183 s, 60.5 MB/s 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:06:23.983 13:34:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:24.244 13:34:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:06:24.504 13:34:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:06:24.504 13:34:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:06:24.764 13:34:39 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:06:25.024 [2024-06-10 13:34:39.256538] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:06:25.024 [2024-06-10 13:34:39.318745] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1
00:06:25.024 [2024-06-10 13:34:39.318750] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:25.024 [2024-06-10 13:34:39.351253] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:06:25.024 [2024-06-10 13:34:39.351285] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:06:28.322 13:34:42 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1446385 /var/tmp/spdk-nbd.sock
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 1446385 ']'
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@863 -- # return 0
00:06:28.322 13:34:42 event.app_repeat -- event/event.sh@39 -- # killprocess 1446385
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@949 -- # '[' -z 1446385 ']'
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@953 -- # kill -0 1446385
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@954 -- # uname
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1446385
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1446385'
killing process with pid 1446385
13:34:42 event.app_repeat -- common/autotest_common.sh@968 -- # kill 1446385
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@973 -- # wait 1446385
00:06:28.322 spdk_app_start is called in Round 0.
00:06:28.322 Shutdown signal received, stop current app iteration
00:06:28.322 Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 reinitialization...
00:06:28.322 spdk_app_start is called in Round 1.
00:06:28.322 Shutdown signal received, stop current app iteration
00:06:28.322 Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 reinitialization...
00:06:28.322 spdk_app_start is called in Round 2.
00:06:28.322 Shutdown signal received, stop current app iteration
00:06:28.322 Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 reinitialization...
00:06:28.322 spdk_app_start is called in Round 3.
00:06:28.322 Shutdown signal received, stop current app iteration
00:06:28.322 13:34:42 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT
00:06:28.322 13:34:42 event.app_repeat -- event/event.sh@42 -- # return 0
00:06:28.322
00:06:28.322 real 0m17.015s
00:06:28.322 user 0m37.405s
00:06:28.322 sys 0m2.462s
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:28.322 13:34:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:06:28.322 ************************************
00:06:28.322 END TEST app_repeat
00:06:28.322 ************************************
00:06:28.322 13:34:42 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 ))
00:06:28.322
00:06:28.322 real 0m26.371s
00:06:28.322 user 0m54.319s
00:06:28.322 sys 0m3.398s
00:06:28.322 13:34:42 event -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:28.322 13:34:42 event -- common/autotest_common.sh@10 -- # set +x
00:06:28.322 ************************************
00:06:28.322 END TEST event
00:06:28.322 ************************************
00:06:28.322 13:34:42 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:06:28.322 13:34:42 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:28.322 13:34:42 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:28.322 13:34:42 -- common/autotest_common.sh@10 -- # set +x
00:06:28.322 ************************************
00:06:28.322 START TEST thread
00:06:28.322 ************************************
00:06:28.322 13:34:42 thread -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh
00:06:28.322 * Looking for test storage...
00:06:28.322 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread
00:06:28.322 13:34:42 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:28.322 13:34:42 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']'
00:06:28.322 13:34:42 thread -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:28.322 13:34:42 thread -- common/autotest_common.sh@10 -- # set +x
00:06:28.322 ************************************
00:06:28.322 START TEST thread_poller_perf
00:06:28.322 ************************************
00:06:28.322 13:34:42 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
00:06:28.322 [2024-06-10 13:34:42.761612] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:28.322 [2024-06-10 13:34:42.761700] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450011 ]
00:06:28.583 [2024-06-10 13:34:42.855530] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:28.583 [2024-06-10 13:34:42.932266] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:28.583 Running 1000 pollers for 1 seconds with 1 microseconds period.
00:06:29.522 ======================================
00:06:29.522 busy:2408212178 (cyc)
00:06:29.522 total_run_count: 284000
00:06:29.522 tsc_hz: 2400000000 (cyc)
00:06:29.522 ======================================
00:06:29.522 poller_cost: 8479 (cyc), 3532 (nsec)
00:06:29.522
00:06:29.522 real 0m1.259s
00:06:29.522 user 0m1.158s
00:06:29.522 sys 0m0.096s
00:06:29.522 13:34:43 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:29.522 13:34:43 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:06:29.522 ************************************
00:06:29.522 END TEST thread_poller_perf
00:06:29.522 ************************************
00:06:29.783 13:34:44 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:29.783 13:34:44 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']'
00:06:29.783 13:34:44 thread -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:29.783 13:34:44 thread -- common/autotest_common.sh@10 -- # set +x
00:06:29.783 ************************************
00:06:29.783 START TEST thread_poller_perf
00:06:29.783 ************************************
00:06:29.783 13:34:44 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:06:29.783 [2024-06-10 13:34:44.099712] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:29.783 [2024-06-10 13:34:44.099799] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450356 ]
00:06:29.783 [2024-06-10 13:34:44.190122] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:29.783 [2024-06-10 13:34:44.257986] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:29.783 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:06:31.201 ======================================
00:06:31.201 busy:2401824896 (cyc)
00:06:31.201 total_run_count: 3816000
00:06:31.201 tsc_hz: 2400000000 (cyc)
00:06:31.201 ======================================
00:06:31.201 poller_cost: 629 (cyc), 262 (nsec)
00:06:31.202
00:06:31.202 real 0m1.240s
00:06:31.202 user 0m1.145s
00:06:31.202 sys 0m0.091s
00:06:31.202 13:34:45 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:31.202 13:34:45 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:06:31.202 ************************************
00:06:31.202 END TEST thread_poller_perf
00:06:31.202 ************************************
00:06:31.202 13:34:45 thread -- thread/thread.sh@17 -- # [[ y != \y ]]
00:06:31.202
00:06:31.202 real 0m2.752s
00:06:31.202 user 0m2.407s
00:06:31.202 sys 0m0.353s
00:06:31.202 13:34:45 thread -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:31.202 13:34:45 thread -- common/autotest_common.sh@10 -- # set +x
00:06:31.202 ************************************
00:06:31.202 END TEST thread
00:06:31.202 ************************************
00:06:31.202 13:34:45 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh
00:06:31.202 13:34:45 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:31.202 13:34:45 -- common/autotest_common.sh@1106 -- # xtrace_disable
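The two poller_cost lines in the reports above follow directly from the counters printed alongside them: busy cycles divided by total_run_count gives the cost of one poll in TSC cycles, and scaling by tsc_hz converts that to nanoseconds. A minimal re-derivation of those numbers (an illustrative sketch, not part of the test scripts; it assumes plain integer division, which matches the printed values):

```shell
#!/usr/bin/env bash
# Re-derive poller_cost from the counters reported by the two poller_perf runs above.
tsc_hz=2400000000  # 2.4 GHz, as printed in both reports

for spec in "2408212178 284000" "2401824896 3816000"; do
    set -- $spec
    busy=$1   # busy TSC cycles over the 1-second run
    runs=$2   # total_run_count (completed poller iterations)
    cost_cyc=$(( busy / runs ))                      # cycles per poll
    cost_nsec=$(( cost_cyc * 1000000000 / tsc_hz ))  # cycles -> nanoseconds
    echo "poller_cost: ${cost_cyc} (cyc), ${cost_nsec} (nsec)"
done
# -> poller_cost: 8479 (cyc), 3532 (nsec)
# -> poller_cost: 629 (cyc), 262 (nsec)
```

Both output lines reproduce the poller_cost values in the reports, which also shows why the 0-microsecond-period run is cheaper per poll: with no timer arming, each iteration does far less work.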
00:06:31.202 13:34:45 -- common/autotest_common.sh@10 -- # set +x
00:06:31.202 ************************************
00:06:31.202 START TEST accel
00:06:31.202 ************************************
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh
00:06:31.202 * Looking for test storage...
00:06:31.202 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel
00:06:31.202 13:34:45 accel -- accel/accel.sh@81 -- # declare -A expected_opcs
00:06:31.202 13:34:45 accel -- accel/accel.sh@82 -- # get_expected_opcs
00:06:31.202 13:34:45 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:06:31.202 13:34:45 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1450623
00:06:31.202 13:34:45 accel -- accel/accel.sh@63 -- # waitforlisten 1450623
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@830 -- # '[' -z 1450623 ']'
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@835 -- # local max_retries=100
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@839 -- # xtrace_disable
00:06:31.202 13:34:45 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:06:31.202 13:34:45 accel -- common/autotest_common.sh@10 -- # set +x
00:06:31.202 13:34:45 accel -- accel/accel.sh@61 -- # build_accel_config
00:06:31.202 13:34:45 accel -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:31.202 13:34:45 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:31.202 13:34:45 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:31.202 13:34:45 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:31.202 13:34:45 accel -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:31.202 13:34:45 accel -- accel/accel.sh@40 -- # local IFS=,
00:06:31.202 13:34:45 accel -- accel/accel.sh@41 -- # jq -r .
00:06:31.202 [2024-06-10 13:34:45.606968] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:31.202 [2024-06-10 13:34:45.607035] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450623 ]
00:06:31.463 [2024-06-10 13:34:45.701791] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:31.463 [2024-06-10 13:34:45.780657] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:32.034 13:34:46 accel -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:06:32.034 13:34:46 accel -- common/autotest_common.sh@863 -- # return 0
00:06:32.034 13:34:46 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]]
00:06:32.034 13:34:46 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]]
00:06:32.034 13:34:46 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]]
00:06:32.034 13:34:46 accel -- accel/accel.sh@68 -- # [[ -n '' ]]
00:06:32.034 13:34:46 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]"))
00:06:32.034 13:34:46 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments
00:06:32.034 13:34:46 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
00:06:32.034 13:34:46 accel -- common/autotest_common.sh@560 -- # xtrace_disable
00:06:32.034 13:34:46 accel -- common/autotest_common.sh@10 -- # set +x
00:06:32.034 13:34:46 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:06:32.034 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.034 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.034 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.034 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.034 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.034 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.034 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.034 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.035 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.035 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.035 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.295 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.295 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.295 13:34:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}"
00:06:32.295 13:34:46 accel -- accel/accel.sh@72 -- # IFS==
00:06:32.295 13:34:46 accel -- accel/accel.sh@72 -- # read -r opc module
00:06:32.295 13:34:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software
00:06:32.295 13:34:46 accel -- accel/accel.sh@75 -- # killprocess 1450623
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@949 -- # '[' -z 1450623 ']'
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@953 -- # kill -0 1450623
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@954 -- # uname
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1450623
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1450623'
killing process with pid 1450623
13:34:46 accel -- common/autotest_common.sh@968 -- # kill 1450623
13:34:46 accel -- common/autotest_common.sh@973 -- # wait 1450623
00:06:32.295 13:34:46 accel -- accel/accel.sh@76 -- # trap - ERR
00:06:32.295 13:34:46 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']'
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:32.295 13:34:46 accel -- common/autotest_common.sh@10 -- # set +x
00:06:32.555 13:34:46 accel.accel_help -- common/autotest_common.sh@1124 -- # accel_perf -h
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@40 -- # local IFS=,
00:06:32.555 13:34:46 accel.accel_help -- accel/accel.sh@41 -- # jq -r .
00:06:32.555 13:34:46 accel.accel_help -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:32.555 13:34:46 accel.accel_help -- common/autotest_common.sh@10 -- # set +x
00:06:32.555 13:34:46 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress
00:06:32.555 13:34:46 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']'
00:06:32.555 13:34:46 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:32.555 13:34:46 accel -- common/autotest_common.sh@10 -- # set +x
00:06:32.555 ************************************
00:06:32.555 START TEST accel_missing_filename
00:06:32.555 ************************************
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@649 -- # local es=0
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@637 -- # local arg=accel_perf
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # type -t accel_perf
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:32.555 13:34:46 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=,
00:06:32.555 13:34:46 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r .
00:06:32.556 [2024-06-10 13:34:46.948480] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:32.556 [2024-06-10 13:34:46.948560] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450825 ]
00:06:32.816 [2024-06-10 13:34:47.043997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:32.816 [2024-06-10 13:34:47.121331] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:32.816 [2024-06-10 13:34:47.166764] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:32.816 [2024-06-10 13:34:47.203909] accel_perf.c:1464:main: *ERROR*: ERROR starting application
00:06:32.816 A filename is required.
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # es=234
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # (( es > 128 ))
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # es=106
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # case "$es" in
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@669 -- # es=1
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@676 -- # (( !es == 0 ))
00:06:32.816
00:06:32.816 real 0m0.348s
00:06:32.816 user 0m0.243s
00:06:32.816 sys 0m0.133s
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:32.816 13:34:47 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x
00:06:32.816 ************************************
00:06:32.816 END TEST accel_missing_filename
00:06:32.816 ************************************
00:06:33.076 13:34:47 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:33.076 13:34:47 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']'
00:06:33.076 13:34:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:33.076 13:34:47 accel -- common/autotest_common.sh@10 -- # set +x
00:06:33.076 ************************************
00:06:33.076 START TEST accel_compress_verify
00:06:33.076 ************************************
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@649 -- # local es=0
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@637 -- # local arg=accel_perf
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # type -t accel_perf
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:33.076 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=,
00:06:33.076 13:34:47 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r .
00:06:33.076 [2024-06-10 13:34:47.354046] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:33.076 [2024-06-10 13:34:47.354089] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451058 ]
00:06:33.076 [2024-06-10 13:34:47.439245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:33.076 [2024-06-10 13:34:47.503923] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:33.076 [2024-06-10 13:34:47.546522] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:06:33.337 [2024-06-10 13:34:47.583304] accel_perf.c:1464:main: *ERROR*: ERROR starting application
00:06:33.337
00:06:33.337 Compression does not support the verify option, aborting.
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # es=161
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # (( es > 128 ))
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # es=33
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # case "$es" in
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@669 -- # es=1
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@676 -- # (( !es == 0 ))
00:06:33.337
00:06:33.337 real 0m0.304s
00:06:33.337 user 0m0.216s
00:06:33.337 sys 0m0.119s
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:33.337 13:34:47 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x
00:06:33.337 ************************************
00:06:33.337 END TEST accel_compress_verify
00:06:33.337 ************************************
00:06:33.337 13:34:47 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar
00:06:33.337 13:34:47 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']'
00:06:33.337 13:34:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:33.337 13:34:47 accel -- common/autotest_common.sh@10 -- # set +x
00:06:33.337 ************************************
00:06:33.337 START TEST accel_wrong_workload
00:06:33.337 ************************************
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w foobar
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@649 -- # local es=0
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w foobar
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@637 -- # local arg=accel_perf
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # type -t accel_perf
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w foobar
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=,
00:06:33.337 13:34:47 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r .
Unsupported workload type: foobar
[2024-06-10 13:34:47.750683] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:06:33.337 accel_perf options:
00:06:33.337 [-h help message]
00:06:33.337 [-q queue depth per core]
00:06:33.337 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:33.337 [-T number of threads per core
00:06:33.337 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:33.337 [-t time in seconds]
00:06:33.337 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:33.337 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:06:33.337 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:33.337 [-l for compress/decompress workloads, name of uncompressed input file
00:06:33.337 [-S for crc32c workload, use this seed value (default 0)
00:06:33.337 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:33.337 [-f for fill workload, use this BYTE value (default 255)
00:06:33.337 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:33.337 [-y verify result if this switch is on]
00:06:33.337 [-a tasks to allocate per core (default: same value as -q)]
00:06:33.337 Can be used to spread operations across a wider range of memory.
00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # es=1 00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:33.337 00:06:33.337 real 0m0.042s 00:06:33.337 user 0m0.027s 00:06:33.337 sys 0m0.015s 00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:33.337 13:34:47 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:33.337 ************************************ 00:06:33.337 END TEST accel_wrong_workload 00:06:33.337 ************************************ 00:06:33.337 Error: writing output failed: Broken pipe 00:06:33.337 13:34:47 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:33.337 13:34:47 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:06:33.337 13:34:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:33.337 13:34:47 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.598 ************************************ 00:06:33.598 START TEST accel_negative_buffers 00:06:33.598 ************************************ 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@649 -- # local es=0 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:33.598 13:34:47 
accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # type -t accel_perf 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w xor -y -x -1 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:33.598 13:34:47 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:33.598 -x option must be non-negative. 00:06:33.598 [2024-06-10 13:34:47.868480] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:33.598 accel_perf options: 00:06:33.598 [-h help message] 00:06:33.598 [-q queue depth per core] 00:06:33.598 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:33.598 [-T number of threads per core 00:06:33.598 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:06:33.598 [-t time in seconds] 00:06:33.598 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:33.598 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:33.598 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:33.598 [-l for compress/decompress workloads, name of uncompressed input file 00:06:33.598 [-S for crc32c workload, use this seed value (default 0) 00:06:33.598 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:33.598 [-f for fill workload, use this BYTE value (default 255) 00:06:33.598 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:33.598 [-y verify result if this switch is on] 00:06:33.598 [-a tasks to allocate per core (default: same value as -q)] 00:06:33.598 Can be used to spread operations across a wider range of memory. 
00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # es=1 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:33.598 00:06:33.598 real 0m0.041s 00:06:33.598 user 0m0.025s 00:06:33.598 sys 0m0.016s 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:33.598 13:34:47 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:33.598 ************************************ 00:06:33.598 END TEST accel_negative_buffers 00:06:33.598 ************************************ 00:06:33.598 Error: writing output failed: Broken pipe 00:06:33.598 13:34:47 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:33.598 13:34:47 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:33.598 13:34:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:33.598 13:34:47 accel -- common/autotest_common.sh@10 -- # set +x 00:06:33.598 ************************************ 00:06:33.598 START TEST accel_crc32c 00:06:33.598 ************************************ 00:06:33.598 13:34:47 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:33.598 13:34:47 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:33.598 [2024-06-10 13:34:47.987082] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:06:33.598 [2024-06-10 13:34:47.987189] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451200 ] 00:06:33.858 [2024-06-10 13:34:48.077579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.858 [2024-06-10 13:34:48.144542] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.858 
13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.858 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val= 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # 
val=Yes 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:33.859 13:34:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:35.242 13:34:49 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.242 00:06:35.242 real 0m1.336s 00:06:35.242 user 0m1.211s 00:06:35.242 sys 0m0.126s 00:06:35.242 13:34:49 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:35.242 13:34:49 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:35.242 ************************************ 00:06:35.242 END TEST accel_crc32c 00:06:35.242 ************************************ 00:06:35.242 13:34:49 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:35.242 13:34:49 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:35.242 13:34:49 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:35.242 13:34:49 accel -- common/autotest_common.sh@10 -- # set +x 00:06:35.242 ************************************ 00:06:35.242 START TEST accel_crc32c_C2 00:06:35.242 ************************************ 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 
-w crc32c -y -C 2 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:35.242 [2024-06-10 13:34:49.397039] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:35.242 [2024-06-10 13:34:49.397093] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451545 ] 00:06:35.242 [2024-06-10 13:34:49.488484] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.242 [2024-06-10 13:34:49.563526] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.242 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- 
# val=32 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:35.243 13:34:49 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:35.243 13:34:49 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 
00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:36.626 00:06:36.626 real 0m1.340s 00:06:36.626 user 0m1.221s 00:06:36.626 sys 0m0.123s 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:36.626 13:34:50 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:36.626 ************************************ 00:06:36.626 END TEST accel_crc32c_C2 00:06:36.626 ************************************ 00:06:36.626 13:34:50 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:36.626 13:34:50 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:36.626 13:34:50 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:36.626 13:34:50 accel -- common/autotest_common.sh@10 -- # set +x 00:06:36.626 ************************************ 00:06:36.626 START TEST accel_copy 00:06:36.626 ************************************ 00:06:36.626 13:34:50 accel.accel_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy -y 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 
-t 1 -w copy -y
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=,
00:06:36.626 13:34:50 accel.accel_copy -- accel/accel.sh@41 -- # jq -r .
00:06:36.626 [2024-06-10 13:34:50.814196] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:36.626 [2024-06-10 13:34:50.814256] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451893 ]
00:06:36.626 [2024-06-10 13:34:50.904060] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:36.626 [2024-06-10 13:34:50.979790] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=copy
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=software
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=32
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=32
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=1
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds'
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:36.626 13:34:51 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]]
00:06:38.011 13:34:52 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:38.011
00:06:38.011 real	0m1.341s
00:06:38.011 user	0m1.226s
00:06:38.011 sys	0m0.116s
00:06:38.011 13:34:52 accel.accel_copy -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:38.011 13:34:52 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x
00:06:38.012 ************************************
00:06:38.012 END TEST accel_copy
00:06:38.012 ************************************
00:06:38.012 13:34:52 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:06:38.012 13:34:52 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:06:38.012 13:34:52 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:38.012 13:34:52 accel -- common/autotest_common.sh@10 -- # set +x
00:06:38.012 ************************************
00:06:38.012 START TEST accel_fill
00:06:38.012 ************************************
00:06:38.012 13:34:52 accel.accel_fill -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=,
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@41 -- # jq -r .
00:06:38.012 [2024-06-10 13:34:52.233756] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:38.012 [2024-06-10 13:34:52.233816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452120 ]
00:06:38.012 [2024-06-10 13:34:52.324255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:38.012 [2024-06-10 13:34:52.394573] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=fill
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=software
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=64
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=64
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=1
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds'
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:38.012 13:34:52 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@20 -- # val=
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]]
00:06:39.394 13:34:53 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:39.394
00:06:39.394 real	0m1.342s
00:06:39.394 user	0m1.210s
00:06:39.394 sys	0m0.129s
00:06:39.394 13:34:53 accel.accel_fill -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:39.394 13:34:53 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x
00:06:39.394 ************************************
00:06:39.394 END TEST accel_fill
00:06:39.394 ************************************
00:06:39.394 13:34:53 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y
00:06:39.394 13:34:53 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']'
00:06:39.394 13:34:53 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:39.394 13:34:53 accel -- common/autotest_common.sh@10 -- # set +x
00:06:39.394 ************************************
00:06:39.394 START TEST accel_copy_crc32c
00:06:39.394 ************************************
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=,
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r .
00:06:39.395 [2024-06-10 13:34:53.652911] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:39.395 [2024-06-10 13:34:53.652973] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452339 ]
00:06:39.395 [2024-06-10 13:34:53.742011] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:39.395 [2024-06-10 13:34:53.808402] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds'
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:39.395 13:34:53 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]]
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:40.776
00:06:40.776 real	0m1.331s
00:06:40.776 user	0m1.211s
00:06:40.776 sys	0m0.122s
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:40.776 13:34:54 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x
00:06:40.776 ************************************
00:06:40.776 END TEST accel_copy_crc32c
00:06:40.776 ************************************
00:06:40.776 13:34:54 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2
00:06:40.776 13:34:54 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']'
00:06:40.776 13:34:54 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:40.776 13:34:54 accel -- common/autotest_common.sh@10 -- # set +x
00:06:40.776 ************************************
00:06:40.776 START TEST accel_copy_crc32c_C2
00:06:40.776 ************************************
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y -C 2
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=,
00:06:40.776 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r .
00:06:40.776 [2024-06-10 13:34:55.059010] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:40.776 [2024-06-10 13:34:55.059065] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452622 ]
00:06:40.776 [2024-06-10 13:34:55.149809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:40.776 [2024-06-10 13:34:55.223809] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes'
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds'
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.038 13:34:55 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:06:41.980 13:34:56
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.980 00:06:41.980 real 0m1.342s 00:06:41.980 user 0m1.220s 00:06:41.980 sys 0m0.121s 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:41.980 13:34:56 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:41.980 ************************************ 00:06:41.980 END TEST accel_copy_crc32c_C2 00:06:41.980 ************************************ 00:06:41.980 13:34:56 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:41.980 13:34:56 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:41.980 13:34:56 accel -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:06:41.980 13:34:56 accel -- common/autotest_common.sh@10 -- # set +x 00:06:41.980 ************************************ 00:06:41.980 START TEST accel_dualcast 00:06:41.980 ************************************ 00:06:41.980 13:34:56 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dualcast -y 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:06:41.980 13:34:56 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:06:42.241 [2024-06-10 13:34:56.479338] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:42.241 [2024-06-10 13:34:56.479394] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452963 ] 00:06:42.241 [2024-06-10 13:34:56.571022] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.241 [2024-06-10 13:34:56.638982] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.241 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 
00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:42.242 13:34:56 
accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:42.242 13:34:56 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 
13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:43.702 13:34:57 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:06:43.702 00:06:43.702 real 0m1.336s 00:06:43.702 user 0m1.216s 00:06:43.702 sys 0m0.121s 00:06:43.702 13:34:57 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:43.702 13:34:57 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:06:43.702 ************************************ 00:06:43.702 END TEST accel_dualcast 00:06:43.702 ************************************ 00:06:43.702 13:34:57 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:43.702 13:34:57 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:43.702 13:34:57 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:43.702 13:34:57 accel -- common/autotest_common.sh@10 -- # set +x 00:06:43.702 ************************************ 00:06:43.702 START TEST accel_compare 00:06:43.702 ************************************ 00:06:43.702 13:34:57 accel.accel_compare -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compare -y 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.702 13:34:57 
accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:06:43.702 13:34:57 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:06:43.702 [2024-06-10 13:34:57.887697] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:06:43.703 [2024-06-10 13:34:57.887754] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1453316 ] 00:06:43.703 [2024-06-10 13:34:57.976690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.703 [2024-06-10 13:34:58.041246] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" 
in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- 
# val=32 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 
00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:43.703 13:34:58 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_compare 
-- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:45.087 13:34:59 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.087 00:06:45.087 real 0m1.326s 00:06:45.087 user 0m1.207s 00:06:45.087 sys 0m0.123s 00:06:45.087 13:34:59 accel.accel_compare -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:45.087 13:34:59 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:06:45.087 ************************************ 00:06:45.087 END TEST accel_compare 00:06:45.087 ************************************ 00:06:45.087 13:34:59 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:45.087 13:34:59 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:45.087 13:34:59 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:45.087 13:34:59 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.087 ************************************ 00:06:45.087 START TEST accel_xor 00:06:45.087 ************************************ 00:06:45.087 13:34:59 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:45.087 13:34:59 accel.accel_xor -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:45.087 [2024-06-10 13:34:59.288828] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:06:45.087 [2024-06-10 13:34:59.288886] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1453660 ] 00:06:45.087 [2024-06-10 13:34:59.376787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.087 [2024-06-10 13:34:59.442527] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:45.087 13:34:59 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:45.087 13:34:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.472 00:06:46.472 real 0m1.327s 00:06:46.472 user 0m1.208s 00:06:46.472 sys 0m0.124s 00:06:46.472 13:35:00 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:46.472 13:35:00 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:46.472 ************************************ 00:06:46.472 END TEST accel_xor 00:06:46.472 ************************************ 00:06:46.472 13:35:00 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:46.472 13:35:00 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:46.472 13:35:00 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:46.472 13:35:00 accel -- common/autotest_common.sh@10 -- # set +x 00:06:46.472 ************************************ 00:06:46.472 START TEST accel_xor 00:06:46.472 ************************************ 00:06:46.472 13:35:00 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y -x 3 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w xor -y -x 3 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:46.472 [2024-06-10 13:35:00.692673] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:06:46.472 [2024-06-10 13:35:00.692733] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1453919 ] 00:06:46.472 [2024-06-10 13:35:00.783254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.472 [2024-06-10 13:35:00.860852] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor 
-- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- 
accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:46.472 13:35:00 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:46.473 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:46.473 13:35:00 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:47.856 13:35:02 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:47.856 13:35:02 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.856 00:06:47.856 real 0m1.343s 00:06:47.856 user 0m1.219s 00:06:47.856 sys 0m0.126s 00:06:47.856 13:35:02 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:47.856 13:35:02 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:47.856 ************************************ 00:06:47.856 END TEST accel_xor 00:06:47.856 ************************************ 00:06:47.856 13:35:02 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:47.856 13:35:02 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:06:47.856 13:35:02 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:47.856 13:35:02 accel -- common/autotest_common.sh@10 -- # set +x 00:06:47.856 ************************************ 00:06:47.856 START TEST accel_dif_verify 00:06:47.856 ************************************ 00:06:47.856 13:35:02 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_verify 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 
accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:06:47.856 [2024-06-10 13:35:02.112606] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:47.856 [2024-06-10 13:35:02.112667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1454130 ] 00:06:47.856 [2024-06-10 13:35:02.199872] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.856 [2024-06-10 13:35:02.269722] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 
accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 
00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:47.856 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.116 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 
00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:48.117 13:35:02 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:06:49.058 13:35:03 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.058 00:06:49.058 real 0m1.340s 00:06:49.058 user 0m1.224s 00:06:49.058 sys 0m0.113s 00:06:49.058 13:35:03 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:49.058 13:35:03 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:06:49.058 ************************************ 00:06:49.058 END TEST accel_dif_verify 00:06:49.058 ************************************ 00:06:49.058 13:35:03 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:49.058 13:35:03 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:06:49.058 13:35:03 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:49.058 13:35:03 accel -- 
common/autotest_common.sh@10 -- # set +x 00:06:49.058 ************************************ 00:06:49.058 START TEST accel_dif_generate 00:06:49.058 ************************************ 00:06:49.058 13:35:03 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate 00:06:49.058 13:35:03 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:06:49.058 13:35:03 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:06:49.058 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.058 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:06:49.059 13:35:03 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:06:49.059 [2024-06-10 13:35:03.528046] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:49.059 [2024-06-10 13:35:03.528103] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1454386 ] 00:06:49.320 [2024-06-10 13:35:03.617465] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.320 [2024-06-10 13:35:03.687134] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 
00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:06:49.320 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds'
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:49.321 13:35:03 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:06:50.704 13:35:04 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:50.704
00:06:50.704 real 0m1.343s
00:06:50.704 user 0m1.214s
00:06:50.704 sys 0m0.125s
00:06:50.704 13:35:04 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:50.704 13:35:04 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x
00:06:50.704 ************************************
00:06:50.704 END TEST accel_dif_generate
00:06:50.704 ************************************
00:06:50.704 13:35:04 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:06:50.704 13:35:04 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']'
00:06:50.704 13:35:04 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:50.704 13:35:04 accel -- common/autotest_common.sh@10 -- # set +x
00:06:50.704 ************************************
00:06:50.704 START TEST accel_dif_generate_copy
00:06:50.704 ************************************
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate_copy
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=,
00:06:50.704 13:35:04 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r .
00:06:50.704 [2024-06-10 13:35:04.946758] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:50.704 [2024-06-10 13:35:04.946857] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1454735 ]
00:06:50.704 [2024-06-10 13:35:05.043265] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:50.704 [2024-06-10 13:35:05.107213] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.704 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds'
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:50.705 13:35:05 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:52.090
00:06:52.090 real 0m1.335s
00:06:52.090 user 0m1.212s
00:06:52.090 sys 0m0.127s
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:52.090 13:35:06 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x
00:06:52.090 ************************************
00:06:52.090 END TEST accel_dif_generate_copy
00:06:52.090 ************************************
00:06:52.090 13:35:06 accel -- accel/accel.sh@115 -- # [[ y == y ]]
00:06:52.090 13:35:06 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:52.090 13:35:06 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']'
00:06:52.090 13:35:06 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:52.090 13:35:06 accel -- common/autotest_common.sh@10 -- # set +x
00:06:52.090 ************************************
00:06:52.090 START TEST accel_comp
00:06:52.090 ************************************
00:06:52.090 13:35:06 accel.accel_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=,
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@41 -- # jq -r .
00:06:52.090 [2024-06-10 13:35:06.355723] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:52.090 [2024-06-10 13:35:06.355778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1455079 ]
00:06:52.090 [2024-06-10 13:35:06.445002] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:52.090 [2024-06-10 13:35:06.514010] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:52.090 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.350 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=compress
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=software
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=32
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=32
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=1
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds'
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=No
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.351 13:35:06 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]]
00:06:53.292 13:35:07 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:53.292
00:06:53.292 real 0m1.341s
00:06:53.292 user 0m1.218s
00:06:53.292 sys 0m0.122s
00:06:53.292 13:35:07 accel.accel_comp -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:53.292 13:35:07 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x
00:06:53.292 ************************************
00:06:53.292 END TEST accel_comp
00:06:53.292 ************************************
00:06:53.292 13:35:07 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:53.292 13:35:07 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']'
00:06:53.292 13:35:07 accel -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:53.292 13:35:07 accel -- common/autotest_common.sh@10 -- # set +x
00:06:53.292 ************************************
00:06:53.292 START TEST accel_decomp
00:06:53.292 ************************************
00:06:53.292 13:35:07 accel.accel_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
13:35:07 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=,
00:06:53.292 13:35:07 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r .
00:06:53.553 [2024-06-10 13:35:07.771884] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:06:53.553 [2024-06-10 13:35:07.771947] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1455420 ]
00:06:53.553 [2024-06-10 13:35:07.858302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:53.553 [2024-06-10 13:35:07.927138] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes'
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=software
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=1
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds'
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.553 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:53.554 13:35:07 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:54.939 13:35:09 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.939 00:06:54.939 real 0m1.340s 00:06:54.939 user 0m1.213s 00:06:54.939 sys 0m0.123s 00:06:54.939 13:35:09 accel.accel_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:54.939 13:35:09 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:54.939 ************************************ 00:06:54.939 END TEST accel_decomp 00:06:54.939 ************************************ 00:06:54.939 13:35:09 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:54.939 13:35:09 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:06:54.939 13:35:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:54.939 13:35:09 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.939 ************************************ 00:06:54.939 START TEST accel_decomp_full 00:06:54.939 ************************************ 00:06:54.939 13:35:09 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:06:54.939 13:35:09 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:06:54.939 [2024-06-10 13:35:09.190305] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:54.939 [2024-06-10 13:35:09.190373] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1455751 ] 00:06:54.939 [2024-06-10 13:35:09.281541] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.940 [2024-06-10 13:35:09.355423] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 
accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # 
read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:54.940 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.201 13:35:09 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 
13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:56.143 13:35:10 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.143 00:06:56.143 real 0m1.352s 00:06:56.143 user 0m1.218s 00:06:56.143 sys 0m0.138s 00:06:56.143 13:35:10 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:56.143 13:35:10 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:06:56.143 ************************************ 00:06:56.143 END TEST accel_decomp_full 00:06:56.143 ************************************ 00:06:56.143 13:35:10 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:56.143 13:35:10 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:06:56.143 13:35:10 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:56.143 13:35:10 accel -- common/autotest_common.sh@10 -- # set +x 00:06:56.143 ************************************ 00:06:56.143 START TEST accel_decomp_mcore 00:06:56.144 
************************************ 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:56.144 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:56.406 [2024-06-10 13:35:10.620041] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:56.406 [2024-06-10 13:35:10.620093] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1455967 ] 00:06:56.406 [2024-06-10 13:35:10.710497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:56.406 [2024-06-10 13:35:10.784059] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:06:56.406 [2024-06-10 13:35:10.784173] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:06:56.406 [2024-06-10 13:35:10.784312] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.406 [2024-06-10 13:35:10.784312] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 
-- # val=software 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.406 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:56.407 13:35:10 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.802 
13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.802 00:06:57.802 real 0m1.347s 00:06:57.802 user 0m4.486s 00:06:57.802 sys 0m0.128s 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:57.802 13:35:11 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:57.802 ************************************ 00:06:57.802 END TEST accel_decomp_mcore 00:06:57.802 ************************************ 00:06:57.802 13:35:11 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.802 13:35:11 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:06:57.802 13:35:11 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:57.802 13:35:11 accel -- common/autotest_common.sh@10 -- # set +x 00:06:57.802 ************************************ 00:06:57.802 START TEST accel_decomp_full_mcore 00:06:57.802 ************************************ 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:57.802 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:57.803 [2024-06-10 13:35:12.045197] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:57.803 [2024-06-10 13:35:12.045259] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1456185 ] 00:06:57.803 [2024-06-10 13:35:12.135828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:57.803 [2024-06-10 13:35:12.210420] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.803 [2024-06-10 13:35:12.210554] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.803 [2024-06-10 13:35:12.210712] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.803 [2024-06-10 13:35:12.210712] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:57.803 13:35:12 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.803 13:35:12 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.189 00:06:59.189 real 0m1.361s 00:06:59.189 user 0m4.525s 00:06:59.189 sys 0m0.142s 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:59.189 13:35:13 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:59.189 ************************************ 00:06:59.189 END TEST accel_decomp_full_mcore 00:06:59.189 ************************************ 00:06:59.189 13:35:13 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.189 13:35:13 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:06:59.189 13:35:13 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:59.189 13:35:13 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.189 ************************************ 00:06:59.189 START TEST accel_decomp_mthread 
00:06:59.189 ************************************ 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.189 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:59.190 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:59.190 [2024-06-10 13:35:13.477423] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:06:59.190 [2024-06-10 13:35:13.477483] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1456505 ] 00:06:59.190 [2024-06-10 13:35:13.564852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.190 [2024-06-10 13:35:13.629792] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.451 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:06:59.452 13:35:13 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- 
accel/accel.sh@20 -- # val=Yes 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:59.452 13:35:13 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.409 00:07:00.409 real 0m1.343s 00:07:00.409 user 0m1.217s 00:07:00.409 sys 0m0.121s 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:00.409 13:35:14 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:00.409 
************************************ 00:07:00.409 END TEST accel_decomp_mthread 00:07:00.409 ************************************ 00:07:00.409 13:35:14 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.409 13:35:14 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:00.409 13:35:14 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:00.409 13:35:14 accel -- common/autotest_common.sh@10 -- # set +x 00:07:00.409 ************************************ 00:07:00.409 START TEST accel_decomp_full_mthread 00:07:00.409 ************************************ 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:00.409 13:35:14 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:00.409 13:35:14 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:00.672 [2024-06-10 13:35:14.893812] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:00.672 [2024-06-10 13:35:14.893881] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1456846 ] 00:07:00.672 [2024-06-10 13:35:14.982099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.672 [2024-06-10 13:35:15.046603] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # 
case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:00.672 13:35:15 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.057 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.058 00:07:02.058 real 0m1.364s 00:07:02.058 user 0m1.243s 00:07:02.058 sys 0m0.123s 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:02.058 13:35:16 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:02.058 ************************************ 00:07:02.058 END TEST accel_decomp_full_mthread 00:07:02.058 ************************************ 00:07:02.058 13:35:16 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:02.058 13:35:16 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:02.058 13:35:16 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:02.058 13:35:16 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:02.058 13:35:16 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1457193 00:07:02.058 13:35:16 accel -- accel/accel.sh@63 -- # waitforlisten 1457193 00:07:02.058 13:35:16 accel -- common/autotest_common.sh@830 -- # '[' -z 1457193 ']' 00:07:02.058 13:35:16 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.058 13:35:16 accel -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:07:02.058 13:35:16 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.058 13:35:16 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:02.058 13:35:16 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:02.058 13:35:16 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.058 13:35:16 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:02.058 13:35:16 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.058 13:35:16 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.058 13:35:16 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.058 13:35:16 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.058 13:35:16 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:02.058 13:35:16 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:02.058 13:35:16 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:02.058 13:35:16 accel -- accel/accel.sh@41 -- # jq -r . 00:07:02.058 [2024-06-10 13:35:16.326069] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:02.058 [2024-06-10 13:35:16.326124] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1457193 ] 00:07:02.058 [2024-06-10 13:35:16.416725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.058 [2024-06-10 13:35:16.493661] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.630 [2024-06-10 13:35:16.910668] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@863 -- # return 0 00:07:02.891 13:35:17 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:02.891 13:35:17 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:02.891 13:35:17 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:02.891 13:35:17 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:02.891 13:35:17 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:02.891 13:35:17 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.891 13:35:17 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:02.891 13:35:17 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:02.891 "method": "compressdev_scan_accel_module", 00:07:02.891 13:35:17 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:02.891 13:35:17 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:02.891 13:35:17 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.891 13:35:17 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- 
accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # IFS== 00:07:03.152 13:35:17 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:03.152 13:35:17 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:03.152 13:35:17 accel -- accel/accel.sh@75 -- # killprocess 1457193 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@949 -- # '[' -z 1457193 ']' 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@953 -- # kill -0 1457193 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@954 -- # uname 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1457193 00:07:03.152 13:35:17 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:03.153 13:35:17 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:03.153 13:35:17 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1457193' 00:07:03.153 killing process with pid 1457193 00:07:03.153 13:35:17 accel -- common/autotest_common.sh@968 -- # 
kill 1457193 00:07:03.153 13:35:17 accel -- common/autotest_common.sh@973 -- # wait 1457193 00:07:03.413 13:35:17 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:03.413 13:35:17 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.413 13:35:17 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:07:03.413 13:35:17 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:03.413 13:35:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:03.413 ************************************ 00:07:03.413 START TEST accel_cdev_comp 00:07:03.413 ************************************ 00:07:03.413 13:35:17 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:03.413 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.414 13:35:17 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:03.414 13:35:17 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:03.414 [2024-06-10 13:35:17.726562] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:03.414 [2024-06-10 13:35:17.726616] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1457537 ] 00:07:03.414 [2024-06-10 13:35:17.815757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.414 [2024-06-10 13:35:17.885558] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.986 [2024-06-10 13:35:18.302644] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:03.986 [2024-06-10 13:35:18.304512] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1173a10 PMD being used: compress_qat 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 [2024-06-10 13:35:18.307968] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13787b0 PMD being used: compress_qat 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp 
-- accel/accel.sh@20 -- # val=1 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:03.986 13:35:18 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:05.370 13:35:19 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:05.370 00:07:05.370 real 0m1.719s 00:07:05.370 user 0m1.420s 00:07:05.370 sys 0m0.299s 00:07:05.370 13:35:19 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- 
# xtrace_disable 00:07:05.370 13:35:19 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:05.370 ************************************ 00:07:05.370 END TEST accel_cdev_comp 00:07:05.370 ************************************ 00:07:05.370 13:35:19 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:05.370 13:35:19 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:05.370 13:35:19 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:05.370 13:35:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:05.370 ************************************ 00:07:05.370 START TEST accel_cdev_decomp 00:07:05.370 ************************************ 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.370 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@32 
-- # [[ 0 -gt 0 ]] 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:05.371 13:35:19 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:05.371 [2024-06-10 13:35:19.519217] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:05.371 [2024-06-10 13:35:19.519278] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1457887 ] 00:07:05.371 [2024-06-10 13:35:19.608418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.371 [2024-06-10 13:35:19.673722] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.630 [2024-06-10 13:35:20.085965] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:05.630 [2024-06-10 13:35:20.087893] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x879a10 PMD being used: compress_qat 00:07:05.630 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.630 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.630 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.630 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:05.631 [2024-06-10 13:35:20.091533] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x87e750 PMD being used: compress_qat 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:05.631 13:35:20 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:05.631 13:35:20 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:07.014 
13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:07.014 00:07:07.014 real 0m1.709s 00:07:07.014 user 0m1.412s 00:07:07.014 sys 0m0.294s 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:07.014 13:35:21 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:07.014 ************************************ 00:07:07.014 END TEST accel_cdev_decomp 00:07:07.014 ************************************ 00:07:07.014 13:35:21 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.014 13:35:21 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:07.014 13:35:21 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:07.014 13:35:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.014 ************************************ 00:07:07.014 START TEST accel_cdev_decomp_full 00:07:07.014 ************************************ 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.014 13:35:21 accel.accel_cdev_decomp_full 
-- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:07.015 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:07.015 [2024-06-10 13:35:21.306470] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:07.015 [2024-06-10 13:35:21.306524] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1458244 ] 00:07:07.015 [2024-06-10 13:35:21.395293] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.015 [2024-06-10 13:35:21.466513] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.584 [2024-06-10 13:35:21.876820] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:07.584 [2024-06-10 13:35:21.878709] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1531a10 PMD being used: compress_qat 00:07:07.584 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.584 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.584 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.584 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.584 [2024-06-10 13:35:21.881454] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1531670 PMD being used: compress_qat 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- 
accel/accel.sh@20 -- # val=0x1 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:07.585 13:35:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 
accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:08.526 
00:07:08.526 real 0m1.712s 00:07:08.526 user 0m1.422s 00:07:08.526 sys 0m0.294s 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:08.526 13:35:22 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:08.526 ************************************ 00:07:08.526 END TEST accel_cdev_decomp_full 00:07:08.526 ************************************ 00:07:08.786 13:35:23 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.786 13:35:23 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:08.786 13:35:23 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:08.787 13:35:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:08.787 ************************************ 00:07:08.787 START TEST accel_cdev_decomp_mcore 00:07:08.787 ************************************ 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:08.787 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:08.787 [2024-06-10 13:35:23.095427] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:08.787 [2024-06-10 13:35:23.095484] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1458585 ] 00:07:08.787 [2024-06-10 13:35:23.186675] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.046 [2024-06-10 13:35:23.268355] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.046 [2024-06-10 13:35:23.268490] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:07:09.046 [2024-06-10 13:35:23.268647] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.046 [2024-06-10 13:35:23.268648] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.307 [2024-06-10 13:35:23.678620] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:09.307 [2024-06-10 13:35:23.680500] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf79060 PMD being used: compress_qat 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 [2024-06-10 13:35:23.685276] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f443819b8b0 PMD being used: 
compress_qat 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:09.307 [2024-06-10 13:35:23.685887] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f443019b8b0 PMD being used: compress_qat 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 [2024-06-10 13:35:23.687055] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf7e5c0 PMD being used: compress_qat 00:07:09.307 [2024-06-10 13:35:23.687151] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f442819b8b0 PMD being used: compress_qat 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:09.307 13:35:23 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:09.307 13:35:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.690 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.690 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.690 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 
13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:10.691 00:07:10.691 real 0m1.738s 00:07:10.691 user 0m5.837s 00:07:10.691 
sys 0m0.307s 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:10.691 13:35:24 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:10.691 ************************************ 00:07:10.691 END TEST accel_cdev_decomp_mcore 00:07:10.691 ************************************ 00:07:10.691 13:35:24 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.691 13:35:24 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:10.691 13:35:24 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:10.691 13:35:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:10.691 ************************************ 00:07:10.691 START TEST accel_cdev_decomp_full_mcore 00:07:10.691 ************************************ 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:10.691 13:35:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:10.691 [2024-06-10 13:35:24.911598] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:10.691 [2024-06-10 13:35:24.911713] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1458940 ] 00:07:10.691 [2024-06-10 13:35:25.011247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.691 [2024-06-10 13:35:25.086635] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.691 [2024-06-10 13:35:25.086770] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.691 [2024-06-10 13:35:25.086927] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.691 [2024-06-10 13:35:25.086928] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 3 00:07:11.262 [2024-06-10 13:35:25.498650] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:11.262 [2024-06-10 13:35:25.500519] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2654060 PMD being used: compress_qat 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.262 [2024-06-10 13:35:25.504437] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x7f2ef419b8b0 PMD being used: compress_qat 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:11.262 [2024-06-10 13:35:25.505110] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2eec19b8b0 PMD being used: compress_qat 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.262 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 [2024-06-10 13:35:25.506319] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2654100 PMD being used: compress_qat 00:07:11.263 [2024-06-10 13:35:25.506397] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2ee419b8b0 PMD being used: compress_qat 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 
00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:11.263 13:35:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:12.204 13:35:26 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:12.204 00:07:12.204 real 0m1.743s 00:07:12.204 user 0m5.835s 00:07:12.204 sys 0m0.315s 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:12.204 13:35:26 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:12.204 ************************************ 00:07:12.204 END TEST accel_cdev_decomp_full_mcore 00:07:12.204 ************************************ 00:07:12.204 13:35:26 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:12.204 13:35:26 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:12.204 13:35:26 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:12.204 13:35:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:12.464 ************************************ 00:07:12.464 START TEST accel_cdev_decomp_mthread 00:07:12.464 ************************************ 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:12.464 13:35:26 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:12.464 13:35:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:12.464 [2024-06-10 13:35:26.732712] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:12.464 [2024-06-10 13:35:26.732804] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459297 ] 00:07:12.464 [2024-06-10 13:35:26.824809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.464 [2024-06-10 13:35:26.896872] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.111 [2024-06-10 13:35:27.311180] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:13.111 [2024-06-10 13:35:27.313045] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x952a10 PMD being used: compress_qat 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 [2024-06-10 13:35:27.317191] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x957b00 PMD being used: compress_qat 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 
13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 [2024-06-10 13:35:27.319099] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa7a990 PMD being used: compress_qat 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 
13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:13.111 13:35:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 
00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.055 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:14.056 00:07:14.056 real 0m1.727s 00:07:14.056 user 0m1.416s 00:07:14.056 sys 0m0.314s 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:14.056 13:35:28 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:14.056 ************************************ 00:07:14.056 END TEST accel_cdev_decomp_mthread 00:07:14.056 ************************************ 00:07:14.056 13:35:28 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.056 13:35:28 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:14.056 13:35:28 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:14.056 13:35:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.056 ************************************ 00:07:14.056 START TEST accel_cdev_decomp_full_mthread 00:07:14.056 ************************************ 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:14.056 13:35:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:07:14.317 [2024-06-10 13:35:28.534883] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:14.317 [2024-06-10 13:35:28.534946] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459652 ] 00:07:14.317 [2024-06-10 13:35:28.625172] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.317 [2024-06-10 13:35:28.700953] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.889 [2024-06-10 13:35:29.112444] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:14.889 [2024-06-10 13:35:29.114332] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x269fa10 PMD being used: compress_qat 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 [2024-06-10 13:35:29.117644] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x26a2d30 PMD being used: compress_qat 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.889 [2024-06-10 13:35:29.119684] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28a45d0 PMD being used: compress_qat 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.889 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.889 13:35:29 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:14.890 13:35:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.832 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.833 13:35:30 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:15.833 00:07:15.833 real 0m1.725s 00:07:15.833 user 0m1.417s 00:07:15.833 sys 0m0.311s 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:15.833 13:35:30 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:15.833 ************************************ 00:07:15.833 END TEST accel_cdev_decomp_full_mthread 00:07:15.833 ************************************ 00:07:15.833 13:35:30 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:07:15.833 13:35:30 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:15.833 13:35:30 accel -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:07:15.833 13:35:30 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:15.833 13:35:30 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:15.833 13:35:30 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.833 13:35:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:15.833 13:35:30 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.833 13:35:30 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.833 13:35:30 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.833 13:35:30 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.833 13:35:30 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:15.833 13:35:30 accel -- accel/accel.sh@41 -- # jq -r . 00:07:15.833 ************************************ 00:07:15.833 START TEST accel_dif_functional_tests 00:07:15.833 ************************************ 00:07:15.833 13:35:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:16.092 [2024-06-10 13:35:30.354405] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:16.092 [2024-06-10 13:35:30.354455] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459999 ] 00:07:16.092 [2024-06-10 13:35:30.444315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:16.092 [2024-06-10 13:35:30.514585] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.092 [2024-06-10 13:35:30.514717] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:07:16.092 [2024-06-10 13:35:30.514720] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.353 00:07:16.353 00:07:16.353 CUnit - A unit testing framework for C - Version 2.1-3 00:07:16.353 http://cunit.sourceforge.net/ 00:07:16.353 00:07:16.353 00:07:16.353 Suite: accel_dif 00:07:16.353 Test: verify: DIF generated, GUARD check ...passed 00:07:16.353 Test: verify: DIF generated, APPTAG check ...passed 00:07:16.353 Test: verify: DIF generated, REFTAG check ...passed 00:07:16.353 Test: verify: DIF not generated, GUARD check ...[2024-06-10 13:35:30.585061] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:16.353 passed 00:07:16.353 Test: verify: DIF not generated, APPTAG check ...[2024-06-10 13:35:30.585107] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:16.353 passed 00:07:16.353 Test: verify: DIF not generated, REFTAG check ...[2024-06-10 13:35:30.585130] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:16.353 passed 00:07:16.353 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:16.353 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-10 13:35:30.585185] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:16.353 passed 00:07:16.353 Test: verify: APPTAG 
incorrect, no APPTAG check ...passed 00:07:16.353 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:16.353 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:16.353 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-10 13:35:30.585308] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:16.353 passed 00:07:16.353 Test: verify copy: DIF generated, GUARD check ...passed 00:07:16.353 Test: verify copy: DIF generated, APPTAG check ...passed 00:07:16.353 Test: verify copy: DIF generated, REFTAG check ...passed 00:07:16.353 Test: verify copy: DIF not generated, GUARD check ...[2024-06-10 13:35:30.585432] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:16.353 passed 00:07:16.353 Test: verify copy: DIF not generated, APPTAG check ...[2024-06-10 13:35:30.585454] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:16.353 passed 00:07:16.353 Test: verify copy: DIF not generated, REFTAG check ...[2024-06-10 13:35:30.585476] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:16.353 passed 00:07:16.353 Test: generate copy: DIF generated, GUARD check ...passed 00:07:16.353 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:16.353 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:16.353 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:16.353 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:16.353 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:16.353 Test: generate copy: iovecs-len validate ...[2024-06-10 13:35:30.585664] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:16.353 passed 00:07:16.353 Test: generate copy: buffer alignment validate ...passed 00:07:16.353 00:07:16.353 Run Summary: Type Total Ran Passed Failed Inactive 00:07:16.353 suites 1 1 n/a 0 0 00:07:16.353 tests 26 26 26 0 0 00:07:16.353 asserts 115 115 115 0 n/a 00:07:16.353 00:07:16.353 Elapsed time = 0.002 seconds 00:07:16.353 00:07:16.353 real 0m0.404s 00:07:16.353 user 0m0.534s 00:07:16.353 sys 0m0.150s 00:07:16.353 13:35:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:16.353 13:35:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:16.353 ************************************ 00:07:16.353 END TEST accel_dif_functional_tests 00:07:16.353 ************************************ 00:07:16.353 00:07:16.353 real 0m45.317s 00:07:16.353 user 0m54.702s 00:07:16.353 sys 0m7.698s 00:07:16.353 13:35:30 accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:16.353 13:35:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:16.353 ************************************ 00:07:16.353 END TEST accel 00:07:16.353 ************************************ 00:07:16.353 13:35:30 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:16.353 13:35:30 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:16.353 13:35:30 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:16.353 13:35:30 -- common/autotest_common.sh@10 -- # set +x 00:07:16.353 ************************************ 00:07:16.353 START TEST accel_rpc 00:07:16.353 ************************************ 00:07:16.353 13:35:30 accel_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:16.614 * Looking for test storage... 
00:07:16.614 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:16.614 13:35:30 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:16.614 13:35:30 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1460333 00:07:16.614 13:35:30 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1460333 00:07:16.614 13:35:30 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@830 -- # '[' -z 1460333 ']' 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.614 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:16.614 13:35:30 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:16.614 [2024-06-10 13:35:30.996522] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:16.614 [2024-06-10 13:35:30.996572] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1460333 ] 00:07:16.614 [2024-06-10 13:35:31.068634] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.875 [2024-06-10 13:35:31.133719] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.446 13:35:31 accel_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:17.446 13:35:31 accel_rpc -- common/autotest_common.sh@863 -- # return 0 00:07:17.446 13:35:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:17.446 13:35:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:17.446 13:35:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:17.446 13:35:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:17.446 13:35:31 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:17.446 13:35:31 accel_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:17.446 13:35:31 accel_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:17.446 13:35:31 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.446 ************************************ 00:07:17.446 START TEST accel_assign_opcode 00:07:17.446 ************************************ 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # accel_assign_opcode_test_suite 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.446 [2024-06-10 13:35:31.871877] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module incorrect 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.446 [2024-06-10 13:35:31.883900] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.446 13:35:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:17.708 software 00:07:17.708 00:07:17.708 real 0m0.232s 00:07:17.708 user 0m0.049s 00:07:17.708 sys 0m0.012s 00:07:17.708 13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:17.708 
13:35:32 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:17.708 ************************************ 00:07:17.708 END TEST accel_assign_opcode 00:07:17.708 ************************************ 00:07:17.708 13:35:32 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1460333 00:07:17.708 13:35:32 accel_rpc -- common/autotest_common.sh@949 -- # '[' -z 1460333 ']' 00:07:17.708 13:35:32 accel_rpc -- common/autotest_common.sh@953 -- # kill -0 1460333 00:07:17.708 13:35:32 accel_rpc -- common/autotest_common.sh@954 -- # uname 00:07:17.708 13:35:32 accel_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:17.708 13:35:32 accel_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1460333 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1460333' 00:07:17.969 killing process with pid 1460333 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@968 -- # kill 1460333 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@973 -- # wait 1460333 00:07:17.969 00:07:17.969 real 0m1.579s 00:07:17.969 user 0m1.721s 00:07:17.969 sys 0m0.413s 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:17.969 13:35:32 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:17.969 ************************************ 00:07:17.969 END TEST accel_rpc 00:07:17.969 ************************************ 00:07:17.969 13:35:32 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:17.969 13:35:32 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:17.969 13:35:32 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:17.969 13:35:32 -- common/autotest_common.sh@10 -- 
# set +x 00:07:18.231 ************************************ 00:07:18.231 START TEST app_cmdline 00:07:18.231 ************************************ 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:18.231 * Looking for test storage... 00:07:18.231 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:18.231 13:35:32 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:18.231 13:35:32 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1460709 00:07:18.231 13:35:32 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1460709 00:07:18.231 13:35:32 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@830 -- # '[' -z 1460709 ']' 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:18.231 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:18.231 13:35:32 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:18.231 [2024-06-10 13:35:32.639714] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:18.231 [2024-06-10 13:35:32.639778] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1460709 ] 00:07:18.491 [2024-06-10 13:35:32.730180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.491 [2024-06-10 13:35:32.801499] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.062 13:35:33 app_cmdline -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:19.062 13:35:33 app_cmdline -- common/autotest_common.sh@863 -- # return 0 00:07:19.062 13:35:33 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:19.324 { 00:07:19.324 "version": "SPDK v24.09-pre git sha1 c5b9f923d", 00:07:19.324 "fields": { 00:07:19.324 "major": 24, 00:07:19.324 "minor": 9, 00:07:19.324 "patch": 0, 00:07:19.324 "suffix": "-pre", 00:07:19.324 "commit": "c5b9f923d" 00:07:19.324 } 00:07:19.324 } 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 
2 )) 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:19.324 13:35:33 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@649 -- # local es=0 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:19.324 13:35:33 app_cmdline -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:19.585 request: 00:07:19.585 { 00:07:19.585 "method": "env_dpdk_get_mem_stats", 00:07:19.585 "req_id": 1 00:07:19.585 } 00:07:19.585 Got JSON-RPC error response 00:07:19.585 response: 00:07:19.585 { 00:07:19.585 "code": -32601, 00:07:19.585 
"message": "Method not found" 00:07:19.585 } 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@652 -- # es=1 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:19.585 13:35:33 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1460709 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@949 -- # '[' -z 1460709 ']' 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@953 -- # kill -0 1460709 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@954 -- # uname 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:19.585 13:35:33 app_cmdline -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1460709 00:07:19.585 13:35:34 app_cmdline -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:19.585 13:35:34 app_cmdline -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:19.585 13:35:34 app_cmdline -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1460709' 00:07:19.585 killing process with pid 1460709 00:07:19.585 13:35:34 app_cmdline -- common/autotest_common.sh@968 -- # kill 1460709 00:07:19.585 13:35:34 app_cmdline -- common/autotest_common.sh@973 -- # wait 1460709 00:07:19.845 00:07:19.845 real 0m1.746s 00:07:19.845 user 0m2.177s 00:07:19.845 sys 0m0.450s 00:07:19.845 13:35:34 app_cmdline -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:19.845 13:35:34 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:19.845 ************************************ 00:07:19.845 END TEST app_cmdline 00:07:19.845 ************************************ 00:07:19.845 13:35:34 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:19.845 13:35:34 -- 
common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:19.845 13:35:34 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:19.845 13:35:34 -- common/autotest_common.sh@10 -- # set +x 00:07:19.845 ************************************ 00:07:19.845 START TEST version 00:07:19.845 ************************************ 00:07:19.845 13:35:34 version -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:20.106 * Looking for test storage... 00:07:20.106 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:20.106 13:35:34 version -- app/version.sh@17 -- # get_header_version major 00:07:20.106 13:35:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # cut -f2 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.106 13:35:34 version -- app/version.sh@17 -- # major=24 00:07:20.106 13:35:34 version -- app/version.sh@18 -- # get_header_version minor 00:07:20.106 13:35:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # cut -f2 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.106 13:35:34 version -- app/version.sh@18 -- # minor=9 00:07:20.106 13:35:34 version -- app/version.sh@19 -- # get_header_version patch 00:07:20.106 13:35:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # cut -f2 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.106 13:35:34 version -- app/version.sh@19 -- # patch=0 00:07:20.106 13:35:34 version -- 
app/version.sh@20 -- # get_header_version suffix 00:07:20.106 13:35:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # cut -f2 00:07:20.106 13:35:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:20.106 13:35:34 version -- app/version.sh@20 -- # suffix=-pre 00:07:20.106 13:35:34 version -- app/version.sh@22 -- # version=24.9 00:07:20.106 13:35:34 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:20.106 13:35:34 version -- app/version.sh@28 -- # version=24.9rc0 00:07:20.106 13:35:34 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:20.106 13:35:34 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:20.106 13:35:34 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:20.106 13:35:34 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:07:20.106 00:07:20.106 real 0m0.183s 00:07:20.106 user 0m0.093s 00:07:20.106 sys 0m0.132s 00:07:20.106 13:35:34 version -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:20.106 13:35:34 version -- common/autotest_common.sh@10 -- # set +x 00:07:20.106 ************************************ 00:07:20.106 END TEST version 00:07:20.106 ************************************ 00:07:20.106 13:35:34 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:07:20.106 13:35:34 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:20.106 13:35:34 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:20.106 13:35:34 -- common/autotest_common.sh@1106 
-- # xtrace_disable 00:07:20.106 13:35:34 -- common/autotest_common.sh@10 -- # set +x 00:07:20.106 ************************************ 00:07:20.106 START TEST blockdev_general 00:07:20.106 ************************************ 00:07:20.106 13:35:34 blockdev_general -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:20.367 * Looking for test storage... 00:07:20.367 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:20.367 13:35:34 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:07:20.367 13:35:34 blockdev_general -- 
bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1461239 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1461239 00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@830 -- # '[' -z 1461239 ']' 00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:20.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:20.367 13:35:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:20.367 13:35:34 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:20.367 [2024-06-10 13:35:34.710781] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:20.367 [2024-06-10 13:35:34.710830] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1461239 ] 00:07:20.367 [2024-06-10 13:35:34.804399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.627 [2024-06-10 13:35:34.870212] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.198 13:35:35 blockdev_general -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:21.198 13:35:35 blockdev_general -- common/autotest_common.sh@863 -- # return 0 00:07:21.198 13:35:35 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:07:21.198 13:35:35 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:07:21.198 13:35:35 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:21.198 13:35:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.198 13:35:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.458 [2024-06-10 13:35:35.743886] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:21.458 [2024-06-10 13:35:35.743926] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:21.458 00:07:21.458 [2024-06-10 13:35:35.751876] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:21.458 [2024-06-10 13:35:35.751893] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:21.458 00:07:21.458 Malloc0 00:07:21.458 Malloc1 00:07:21.458 Malloc2 00:07:21.458 Malloc3 00:07:21.458 Malloc4 00:07:21.458 Malloc5 00:07:21.458 Malloc6 00:07:21.458 Malloc7 00:07:21.458 Malloc8 00:07:21.458 Malloc9 00:07:21.458 [2024-06-10 13:35:35.863585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:21.458 [2024-06-10 13:35:35.863620] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:21.458 [2024-06-10 13:35:35.863632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f03340 00:07:21.458 [2024-06-10 13:35:35.863639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:21.458 [2024-06-10 13:35:35.864846] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:21.458 [2024-06-10 13:35:35.864865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:21.458 TestPT 00:07:21.458 13:35:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.458 13:35:35 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:21.458 5000+0 records in 00:07:21.458 5000+0 records out 00:07:21.458 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0157583 s, 650 MB/s 00:07:21.458 13:35:35 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:21.458 13:35:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.458 13:35:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 AIO0 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 13:35:35 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:07:21.719 13:35:35 
blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 13:35:35 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:07:21.719 13:35:35 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 13:35:35 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.719 13:35:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 13:35:36 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 13:35:36 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:07:21.719 13:35:36 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:07:21.719 13:35:36 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:21.719 13:35:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:21.719 
13:35:36 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:07:21.719 13:35:36 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:07:21.720 13:35:36 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "605f33ca-2a30-4602-8efd-ef62ebee6504"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "605f33ca-2a30-4602-8efd-ef62ebee6504",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "aa879bea-8db6-5409-8436-ca3a28ad5ecb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "aa879bea-8db6-5409-8436-ca3a28ad5ecb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7e0d8304-904a-5c1b-b76b-1a7cd55ac899"' ' ],' ' "product_name": "Split Disk",' ' 
"block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7e0d8304-904a-5c1b-b76b-1a7cd55ac899",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "04b8b342-71f1-5236-b876-729933fd5568"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "04b8b342-71f1-5236-b876-729933fd5568",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "101d4554-9b87-583f-9dd6-f2b02515ddbe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "101d4554-9b87-583f-9dd6-f2b02515ddbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": 
true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8ebe4a8-6014-5838-ae63-8db4848f2dd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8ebe4a8-6014-5838-ae63-8db4848f2dd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "871f88f9-37d1-56b4-822e-bc9457501387"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "871f88f9-37d1-56b4-822e-bc9457501387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "008634fc-1d7d-5583-b43b-282521449004"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "008634fc-1d7d-5583-b43b-282521449004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "47b4e6f9-9632-5fca-bfb0-25b8cb44e762"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47b4e6f9-9632-5fca-bfb0-25b8cb44e762",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' 
"name": "Malloc2p7",' ' "aliases": [' ' "bb20bb00-2537-502c-8b05-cb511c04f658"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb20bb00-2537-502c-8b05-cb511c04f658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e1e337c0-bc1f-5bde-ba13-29d711804bf2"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e1e337c0-bc1f-5bde-ba13-29d711804bf2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "f60adfff-c170-491a-a1b2-16090cd578fb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd0b6634-6f04-467c-8725-d81bd7278549",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d48d1ea7-91bc-4dd7-8831-a53ae4b6e493",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "1f947de7-adbe-400d-8ae5-b3cec593608d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "45d6aa97-ac74-477d-91e7-60ab84604d76",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "34d08a68-9f2e-4b15-b85b-210f4364e65a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' 
"strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "78e4a155-e00d-4eab-bad9-1597a8870850",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "fbdc7dcd-953d-4fd2-aa37-55fe0c6bd9bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "88ba31ec-9450-4ca0-8ac3-5212781ba869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "88ba31ec-9450-4ca0-8ac3-5212781ba869",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:21.720 13:35:36 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:07:21.720 13:35:36 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:07:21.720 13:35:36 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:07:21.720 13:35:36 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1461239 00:07:21.720 13:35:36 blockdev_general -- common/autotest_common.sh@949 -- # '[' -z 1461239 ']' 00:07:21.720 13:35:36 blockdev_general -- common/autotest_common.sh@953 -- # kill -0 1461239 00:07:21.720 13:35:36 
blockdev_general -- common/autotest_common.sh@954 -- # uname 00:07:21.720 13:35:36 blockdev_general -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:21.720 13:35:36 blockdev_general -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1461239 00:07:21.981 13:35:36 blockdev_general -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:21.981 13:35:36 blockdev_general -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:21.981 13:35:36 blockdev_general -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1461239' 00:07:21.981 killing process with pid 1461239 00:07:21.981 13:35:36 blockdev_general -- common/autotest_common.sh@968 -- # kill 1461239 00:07:21.981 13:35:36 blockdev_general -- common/autotest_common.sh@973 -- # wait 1461239 00:07:22.242 13:35:36 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:22.242 13:35:36 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:22.242 13:35:36 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:22.242 13:35:36 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:22.242 13:35:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:22.242 ************************************ 00:07:22.242 START TEST bdev_hello_world 00:07:22.242 ************************************ 00:07:22.242 13:35:36 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:22.242 [2024-06-10 13:35:36.587874] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:22.242 [2024-06-10 13:35:36.587918] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1461600 ] 00:07:22.242 [2024-06-10 13:35:36.676436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.504 [2024-06-10 13:35:36.741349] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.504 [2024-06-10 13:35:36.860663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:22.504 [2024-06-10 13:35:36.860704] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:22.504 [2024-06-10 13:35:36.860713] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:22.504 [2024-06-10 13:35:36.868670] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:22.504 [2024-06-10 13:35:36.868691] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:22.504 [2024-06-10 13:35:36.876679] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:22.504 [2024-06-10 13:35:36.876696] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:22.504 [2024-06-10 13:35:36.938617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:22.504 [2024-06-10 13:35:36.938656] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:22.504 [2024-06-10 13:35:36.938667] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3cb30 00:07:22.504 [2024-06-10 13:35:36.938673] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:22.504 [2024-06-10 13:35:36.940020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:07:22.504 [2024-06-10 13:35:36.940041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:22.765 [2024-06-10 13:35:37.061487] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:22.765 [2024-06-10 13:35:37.061519] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:22.765 [2024-06-10 13:35:37.061541] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:22.765 [2024-06-10 13:35:37.061575] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:22.765 [2024-06-10 13:35:37.061611] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:22.765 [2024-06-10 13:35:37.061620] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:22.765 [2024-06-10 13:35:37.061646] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:07:22.765 00:07:22.765 [2024-06-10 13:35:37.061661] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:23.026 00:07:23.026 real 0m0.714s 00:07:23.026 user 0m0.487s 00:07:23.026 sys 0m0.199s 00:07:23.026 13:35:37 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:23.026 13:35:37 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:23.026 ************************************ 00:07:23.026 END TEST bdev_hello_world 00:07:23.026 ************************************ 00:07:23.026 13:35:37 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:07:23.026 13:35:37 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:07:23.026 13:35:37 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:23.026 13:35:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:23.026 ************************************ 00:07:23.026 START TEST bdev_bounds 00:07:23.026 ************************************ 00:07:23.026 13:35:37 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1461701 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1461701' 00:07:23.026 Process bdevio pid: 1461701 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1461701 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1461701 ']' 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:23.026 13:35:37 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:23.026 [2024-06-10 13:35:37.372082] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:07:23.026 [2024-06-10 13:35:37.372131] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1461701 ] 00:07:23.026 [2024-06-10 13:35:37.460082] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:23.287 [2024-06-10 13:35:37.526343] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.287 [2024-06-10 13:35:37.526365] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:07:23.287 [2024-06-10 13:35:37.526367] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.287 [2024-06-10 13:35:37.651700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:23.287 [2024-06-10 13:35:37.651748] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:23.287 [2024-06-10 13:35:37.651760] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:23.287 [2024-06-10 13:35:37.659710] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:23.287 [2024-06-10 13:35:37.659730] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:23.287 [2024-06-10 13:35:37.667722] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:23.287 [2024-06-10 13:35:37.667739] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:23.287 [2024-06-10 13:35:37.729886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:23.287 [2024-06-10 13:35:37.729926] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:23.287 [2024-06-10 13:35:37.729936] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2692d90 00:07:23.287 [2024-06-10 
13:35:37.729943] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:23.287 [2024-06-10 13:35:37.731221] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:23.287 [2024-06-10 13:35:37.731241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:23.858 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:23.859 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:07:23.859 13:35:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:23.859 I/O targets: 00:07:23.859 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:23.859 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:23.859 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:23.859 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:23.859 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:23.859 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:23.859 concat0: 131072 blocks of 512 bytes (64 MiB) 00:07:23.859 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:23.859 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:23.859 00:07:23.859 00:07:23.859 CUnit - A unit testing framework for C - Version 2.1-3 00:07:23.859 http://cunit.sourceforge.net/ 00:07:23.859 00:07:23.859 00:07:23.859 Suite: bdevio tests on: AIO0 00:07:23.859 Test: blockdev write read block ...passed 00:07:23.859 Test: blockdev write zeroes read block ...passed 00:07:23.859 Test: blockdev write zeroes 
read no split ...passed 00:07:23.859 Test: blockdev write zeroes read split ...passed 00:07:23.859 Test: blockdev write zeroes read split partial ...passed 00:07:23.859 Test: blockdev reset ...passed 00:07:23.859 Test: blockdev write read 8 blocks ...passed 00:07:23.859 Test: blockdev write read size > 128k ...passed 00:07:23.859 Test: blockdev write read invalid size ...passed 00:07:23.859 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:23.859 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:23.859 Test: blockdev write read max offset ...passed 00:07:23.859 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:23.859 Test: blockdev writev readv 8 blocks ...passed 00:07:23.859 Test: blockdev writev readv 30 x 1block ...passed 00:07:23.859 Test: blockdev writev readv block ...passed 00:07:23.859 Test: blockdev writev readv size > 128k ...passed 00:07:23.859 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:23.859 Test: blockdev comparev and writev ...passed 00:07:23.859 Test: blockdev nvme passthru rw ...passed 00:07:23.859 Test: blockdev nvme passthru vendor specific ...passed 00:07:23.859 Test: blockdev nvme admin passthru ...passed 00:07:23.859 Test: blockdev copy ...passed 00:07:23.859 Suite: bdevio tests on: raid1 00:07:23.859 Test: blockdev write read block ...passed 00:07:23.859 Test: blockdev write zeroes read block ...passed 00:07:23.859 Test: blockdev write zeroes read no split ...passed 00:07:24.121 Test: blockdev write zeroes read split ...passed 00:07:24.121 Test: blockdev write zeroes read split partial ...passed 00:07:24.121 Test: blockdev reset ...passed 00:07:24.121 Test: blockdev write read 8 blocks ...passed 00:07:24.121 Test: blockdev write read size > 128k ...passed 00:07:24.121 Test: blockdev write read invalid size ...passed 00:07:24.121 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.121 Test: blockdev write 
read offset + nbytes > size of blockdev ...passed 00:07:24.121 Test: blockdev write read max offset ...passed 00:07:24.121 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.121 Test: blockdev writev readv 8 blocks ...passed 00:07:24.121 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.121 Test: blockdev writev readv block ...passed 00:07:24.121 Test: blockdev writev readv size > 128k ...passed 00:07:24.121 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.121 Test: blockdev comparev and writev ...passed 00:07:24.121 Test: blockdev nvme passthru rw ...passed 00:07:24.121 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.121 Test: blockdev nvme admin passthru ...passed 00:07:24.121 Test: blockdev copy ...passed 00:07:24.121 Suite: bdevio tests on: concat0 00:07:24.121 Test: blockdev write read block ...passed 00:07:24.121 Test: blockdev write zeroes read block ...passed 00:07:24.121 Test: blockdev write zeroes read no split ...passed 00:07:24.121 Test: blockdev write zeroes read split ...passed 00:07:24.121 Test: blockdev write zeroes read split partial ...passed 00:07:24.121 Test: blockdev reset ...passed 00:07:24.121 Test: blockdev write read 8 blocks ...passed 00:07:24.121 Test: blockdev write read size > 128k ...passed 00:07:24.121 Test: blockdev write read invalid size ...passed 00:07:24.121 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.121 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.121 Test: blockdev write read max offset ...passed 00:07:24.121 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.121 Test: blockdev writev readv 8 blocks ...passed 00:07:24.121 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.121 Test: blockdev writev readv block ...passed 00:07:24.121 Test: blockdev writev readv size > 128k ...passed 00:07:24.121 Test: blockdev writev readv size > 128k in two 
iovs ...passed 00:07:24.121 Test: blockdev comparev and writev ...passed 00:07:24.121 Test: blockdev nvme passthru rw ...passed 00:07:24.121 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.121 Test: blockdev nvme admin passthru ...passed 00:07:24.121 Test: blockdev copy ...passed 00:07:24.121 Suite: bdevio tests on: raid0 00:07:24.121 Test: blockdev write read block ...passed 00:07:24.121 Test: blockdev write zeroes read block ...passed 00:07:24.121 Test: blockdev write zeroes read no split ...passed 00:07:24.121 Test: blockdev write zeroes read split ...passed 00:07:24.121 Test: blockdev write zeroes read split partial ...passed 00:07:24.121 Test: blockdev reset ...passed 00:07:24.121 Test: blockdev write read 8 blocks ...passed 00:07:24.121 Test: blockdev write read size > 128k ...passed 00:07:24.121 Test: blockdev write read invalid size ...passed 00:07:24.121 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: TestPT 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes 
read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p7 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev 
write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p6 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 
128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p5 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p4 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: 
blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p3 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.122 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.122 Test: blockdev nvme admin passthru ...passed 00:07:24.122 Test: blockdev copy ...passed 00:07:24.122 Suite: bdevio tests on: Malloc2p2 00:07:24.122 Test: blockdev write read block ...passed 00:07:24.122 Test: blockdev write zeroes read block ...passed 00:07:24.122 Test: blockdev write zeroes read no split ...passed 00:07:24.122 Test: blockdev write zeroes read split ...passed 00:07:24.122 Test: blockdev write zeroes read split partial ...passed 00:07:24.122 Test: blockdev reset ...passed 00:07:24.122 Test: blockdev write read 8 blocks ...passed 00:07:24.122 Test: blockdev write read size > 128k ...passed 00:07:24.122 Test: blockdev write read invalid size ...passed 00:07:24.122 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.122 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.122 Test: blockdev write read max offset ...passed 00:07:24.122 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.122 Test: blockdev writev readv 8 blocks ...passed 00:07:24.122 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.122 Test: blockdev writev readv block ...passed 00:07:24.122 Test: blockdev writev readv size > 128k ...passed 00:07:24.122 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:07:24.122 Test: blockdev comparev and writev ...passed 00:07:24.122 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123 Suite: bdevio tests on: Malloc2p1 00:07:24.123 Test: blockdev write read block ...passed 00:07:24.123 Test: blockdev write zeroes read block ...passed 00:07:24.123 Test: blockdev write zeroes read no split ...passed 00:07:24.123 Test: blockdev write zeroes read split ...passed 00:07:24.123 Test: blockdev write zeroes read split partial ...passed 00:07:24.123 Test: blockdev reset ...passed 00:07:24.123 Test: blockdev write read 8 blocks ...passed 00:07:24.123 Test: blockdev write read size > 128k ...passed 00:07:24.123 Test: blockdev write read invalid size ...passed 00:07:24.123 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.123 Test: blockdev write read max offset ...passed 00:07:24.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.123 Test: blockdev writev readv 8 blocks ...passed 00:07:24.123 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.123 Test: blockdev writev readv block ...passed 00:07:24.123 Test: blockdev writev readv size > 128k ...passed 00:07:24.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.123 Test: blockdev comparev and writev ...passed 00:07:24.123 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123 Suite: bdevio tests on: Malloc2p0 00:07:24.123 Test: blockdev write read block ...passed 00:07:24.123 Test: blockdev write zeroes read block 
...passed 00:07:24.123 Test: blockdev write zeroes read no split ...passed 00:07:24.123 Test: blockdev write zeroes read split ...passed 00:07:24.123 Test: blockdev write zeroes read split partial ...passed 00:07:24.123 Test: blockdev reset ...passed 00:07:24.123 Test: blockdev write read 8 blocks ...passed 00:07:24.123 Test: blockdev write read size > 128k ...passed 00:07:24.123 Test: blockdev write read invalid size ...passed 00:07:24.123 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.123 Test: blockdev write read max offset ...passed 00:07:24.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.123 Test: blockdev writev readv 8 blocks ...passed 00:07:24.123 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.123 Test: blockdev writev readv block ...passed 00:07:24.123 Test: blockdev writev readv size > 128k ...passed 00:07:24.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.123 Test: blockdev comparev and writev ...passed 00:07:24.123 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123 Suite: bdevio tests on: Malloc1p1 00:07:24.123 Test: blockdev write read block ...passed 00:07:24.123 Test: blockdev write zeroes read block ...passed 00:07:24.123 Test: blockdev write zeroes read no split ...passed 00:07:24.123 Test: blockdev write zeroes read split ...passed 00:07:24.123 Test: blockdev write zeroes read split partial ...passed 00:07:24.123 Test: blockdev reset ...passed 00:07:24.123 Test: blockdev write read 8 blocks ...passed 00:07:24.123 Test: blockdev write read size > 128k ...passed 00:07:24.123 Test: blockdev write read invalid size ...passed 00:07:24.123 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:07:24.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.123 Test: blockdev write read max offset ...passed 00:07:24.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.123 Test: blockdev writev readv 8 blocks ...passed 00:07:24.123 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.123 Test: blockdev writev readv block ...passed 00:07:24.123 Test: blockdev writev readv size > 128k ...passed 00:07:24.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.123 Test: blockdev comparev and writev ...passed 00:07:24.123 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123 Suite: bdevio tests on: Malloc1p0 00:07:24.123 Test: blockdev write read block ...passed 00:07:24.123 Test: blockdev write zeroes read block ...passed 00:07:24.123 Test: blockdev write zeroes read no split ...passed 00:07:24.123 Test: blockdev write zeroes read split ...passed 00:07:24.123 Test: blockdev write zeroes read split partial ...passed 00:07:24.123 Test: blockdev reset ...passed 00:07:24.123 Test: blockdev write read 8 blocks ...passed 00:07:24.123 Test: blockdev write read size > 128k ...passed 00:07:24.123 Test: blockdev write read invalid size ...passed 00:07:24.123 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.123 Test: blockdev write read max offset ...passed 00:07:24.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.123 Test: blockdev writev readv 8 blocks ...passed 00:07:24.123 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.123 Test: blockdev writev readv block ...passed 00:07:24.123 Test: blockdev writev readv size > 128k ...passed 
00:07:24.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.123 Test: blockdev comparev and writev ...passed 00:07:24.123 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123 Suite: bdevio tests on: Malloc0 00:07:24.123 Test: blockdev write read block ...passed 00:07:24.123 Test: blockdev write zeroes read block ...passed 00:07:24.123 Test: blockdev write zeroes read no split ...passed 00:07:24.123 Test: blockdev write zeroes read split ...passed 00:07:24.123 Test: blockdev write zeroes read split partial ...passed 00:07:24.123 Test: blockdev reset ...passed 00:07:24.123 Test: blockdev write read 8 blocks ...passed 00:07:24.123 Test: blockdev write read size > 128k ...passed 00:07:24.123 Test: blockdev write read invalid size ...passed 00:07:24.123 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:24.123 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:24.123 Test: blockdev write read max offset ...passed 00:07:24.123 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:24.123 Test: blockdev writev readv 8 blocks ...passed 00:07:24.123 Test: blockdev writev readv 30 x 1block ...passed 00:07:24.123 Test: blockdev writev readv block ...passed 00:07:24.123 Test: blockdev writev readv size > 128k ...passed 00:07:24.123 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:24.123 Test: blockdev comparev and writev ...passed 00:07:24.123 Test: blockdev nvme passthru rw ...passed 00:07:24.123 Test: blockdev nvme passthru vendor specific ...passed 00:07:24.123 Test: blockdev nvme admin passthru ...passed 00:07:24.123 Test: blockdev copy ...passed 00:07:24.123
00:07:24.123 Run Summary: Type Total Ran Passed Failed Inactive
00:07:24.123 suites 16 16 n/a 0 0
00:07:24.123 tests 368 368 368 0 0
00:07:24.123 asserts 2224 2224 2224 0 n/a
00:07:24.123 00:07:24.123 Elapsed time = 0.449 seconds 00:07:24.123 0 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1461701 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1461701 ']' 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1461701 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:24.123 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1461701 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1461701' killing process with pid 1461701 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1461701 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1461701 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:07:24.386 00:07:24.386 real 0m1.455s 00:07:24.386 user 0m3.845s 00:07:24.386 sys 0m0.339s 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:24.386 13:35:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:24.386 ************************************ 00:07:24.386 END TEST bdev_bounds 00:07:24.386 ************************************ 00:07:24.386 13:35:38 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:24.386 13:35:38 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:07:24.386 13:35:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:24.386 13:35:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:24.386 ************************************ 00:07:24.386 START TEST bdev_nbd 00:07:24.386 ************************************ 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1462001 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1462001 /var/tmp/spdk-nbd.sock 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1462001 ']' 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:24.386 
13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:24.386 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:24.387 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:24.387 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:24.387 13:35:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:24.648 [2024-06-10 13:35:38.907285] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:07:24.648 [2024-06-10 13:35:38.907333] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:24.648 [2024-06-10 13:35:38.997617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.648 [2024-06-10 13:35:39.074638] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.909 [2024-06-10 13:35:39.200508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:24.909 [2024-06-10 13:35:39.200555] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:24.909 [2024-06-10 13:35:39.200563] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:24.909 [2024-06-10 13:35:39.208517] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:24.909 [2024-06-10 13:35:39.208536] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:24.909 [2024-06-10 13:35:39.216528] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:24.909 [2024-06-10 13:35:39.216546] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:24.909 [2024-06-10 13:35:39.278583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:24.909 [2024-06-10 13:35:39.278616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:24.909 [2024-06-10 13:35:39.278626] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x258e0f0 00:07:24.909 [2024-06-10 13:35:39.278632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:24.909 [2024-06-10 13:35:39.279906] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:24.909 [2024-06-10 13:35:39.279926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT 
raid0 concat0 raid1 AIO0' 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:25.482 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:25.743 13:35:39 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:25.743 1+0 records in 00:07:25.743 1+0 records out 00:07:25.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000293457 s, 14.0 MB/s 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:25.743 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:25.744 13:35:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:25.744 13:35:40 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:25.744 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.005 1+0 records in 00:07:26.005 1+0 records out 00:07:26.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312423 s, 13.1 MB/s 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd2 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.005 1+0 records in 00:07:26.005 1+0 records out 00:07:26.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186234 s, 22.0 MB/s 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:26.005 13:35:40 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.005 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.266 1+0 records in 00:07:26.266 1+0 records out 00:07:26.266 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187579 s, 21.8 MB/s 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # size=4096 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.266 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.527 1+0 records in 00:07:26.527 1+0 records out 00:07:26.527 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236311 s, 17.3 MB/s 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.527 13:35:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep 
-q -w nbd5 /proc/partitions 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:26.787 1+0 records in 00:07:26.787 1+0 records out 00:07:26.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291237 s, 14.1 MB/s 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:26.787 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:27.047 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.047 1+0 records in 00:07:27.047 1+0 records out 00:07:27.048 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309226 s, 13.2 MB/s 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.048 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.308 1+0 records in 00:07:27.308 1+0 records out 00:07:27.308 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320029 s, 12.8 MB/s 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.308 13:35:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.308 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.568 1+0 records in 00:07:27.568 1+0 records out 00:07:27.568 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311355 s, 13.2 MB/s 00:07:27.568 
13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.568 13:35:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:27.828 13:35:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:27.828 1+0 records in 00:07:27.828 1+0 records out 00:07:27.828 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341447 s, 12.0 MB/s 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:27.828 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 
00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.087 1+0 records in 00:07:28.087 1+0 records out 00:07:28.087 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000384068 s, 10.7 MB/s 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.087 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.347 1+0 records in 00:07:28.347 1+0 records out 00:07:28.347 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000441187 s, 9.3 MB/s 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 
00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.347 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.608 1+0 records in 00:07:28.608 1+0 records out 00:07:28.608 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000463716 s, 8.8 MB/s 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:28.608 13:35:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.608 13:35:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:28.609 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:28.609 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:28.869 1+0 records in 00:07:28.869 1+0 records out 00:07:28.869 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000437236 s, 9.4 MB/s 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:28.869 1+0 records in 00:07:28.869 1+0 records out 00:07:28.869 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000424825 s, 9.6 MB/s 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:28.869 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:07:29.129 13:35:43 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:29.129 1+0 records in 00:07:29.129 1+0 records out 00:07:29.129 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375009 s, 10.9 MB/s 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:29.129 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 
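The trace above repeats one pattern for every device: a `waitfornbd` helper that polls `/proc/partitions` up to 20 times until the new nbd name appears, then verifies the device with a direct-I/O `dd` read and a `stat` size check. The following is a minimal, hedged sketch of just the polling half; the second `parts_file` parameter is an illustration-only addition (the real helper always reads `/proc/partitions`), and the `dd`/`stat` verification is omitted because it needs a live block device.

```shell
# Sketch of the waitfornbd polling pattern from the trace above.
# NOTE: the parts_file argument is hypothetical, added so the loop can
# be exercised against a file; the original helper hard-codes
# /proc/partitions and follows up with a dd direct read.
waitfornbd() {
    nbd_name=$1
    parts_file=${2:-/proc/partitions}
    i=1
    while [ "$i" -le 20 ]; do
        # -w matches the whole device name, so nbd1 does not match nbd10
        if grep -q -w "$nbd_name" "$parts_file"; then
            return 0
        fi
        i=$((i + 1))
        sleep 0.1
    done
    return 1
}
```

The word-match flag matters here: without `-w`, waiting for `nbd1` would spuriously succeed as soon as `nbd10` through `nbd15` exist.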
00:07:29.389 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd0", 00:07:29.390 "bdev_name": "Malloc0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd1", 00:07:29.390 "bdev_name": "Malloc1p0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd2", 00:07:29.390 "bdev_name": "Malloc1p1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd3", 00:07:29.390 "bdev_name": "Malloc2p0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd4", 00:07:29.390 "bdev_name": "Malloc2p1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd5", 00:07:29.390 "bdev_name": "Malloc2p2" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd6", 00:07:29.390 "bdev_name": "Malloc2p3" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd7", 00:07:29.390 "bdev_name": "Malloc2p4" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd8", 00:07:29.390 "bdev_name": "Malloc2p5" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd9", 00:07:29.390 "bdev_name": "Malloc2p6" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd10", 00:07:29.390 "bdev_name": "Malloc2p7" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd11", 00:07:29.390 "bdev_name": "TestPT" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd12", 00:07:29.390 "bdev_name": "raid0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd13", 00:07:29.390 "bdev_name": "concat0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd14", 00:07:29.390 "bdev_name": "raid1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd15", 00:07:29.390 "bdev_name": "AIO0" 00:07:29.390 } 00:07:29.390 ]' 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 
00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd0", 00:07:29.390 "bdev_name": "Malloc0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd1", 00:07:29.390 "bdev_name": "Malloc1p0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd2", 00:07:29.390 "bdev_name": "Malloc1p1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd3", 00:07:29.390 "bdev_name": "Malloc2p0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd4", 00:07:29.390 "bdev_name": "Malloc2p1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd5", 00:07:29.390 "bdev_name": "Malloc2p2" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd6", 00:07:29.390 "bdev_name": "Malloc2p3" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd7", 00:07:29.390 "bdev_name": "Malloc2p4" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd8", 00:07:29.390 "bdev_name": "Malloc2p5" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd9", 00:07:29.390 "bdev_name": "Malloc2p6" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd10", 00:07:29.390 "bdev_name": "Malloc2p7" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd11", 00:07:29.390 "bdev_name": "TestPT" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd12", 00:07:29.390 "bdev_name": "raid0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd13", 00:07:29.390 "bdev_name": "concat0" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd14", 00:07:29.390 "bdev_name": "raid1" 00:07:29.390 }, 00:07:29.390 { 00:07:29.390 "nbd_device": "/dev/nbd15", 00:07:29.390 "bdev_name": "AIO0" 00:07:29.390 } 00:07:29.390 ]' 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- 
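The `nbd_get_disks` JSON above is reduced to a plain list of device paths with `jq -r '.[] | .nbd_device'` before teardown. A small sketch of that extraction on a trimmed copy of the JSON (assumes `jq` is installed; the three-entry array below is an abbreviation of the sixteen-entry output shown in the log):

```shell
# Trimmed copy of the nbd_get_disks output above (3 of 16 entries).
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0",  "bdev_name": "Malloc0" },
  { "nbd_device": "/dev/nbd12", "bdev_name": "raid0" },
  { "nbd_device": "/dev/nbd15", "bdev_name": "AIO0" }
]'

# Same filter as nbd_common.sh@119: -r emits raw strings (no quotes),
# .[] iterates the array, .nbd_device keeps only the device path.
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
echo "$nbd_disks_name"
# → /dev/nbd0
#   /dev/nbd12
#   /dev/nbd15
```

The resulting newline-separated list is what `nbd_stop_disks` then iterates over, one `nbd_stop_disk` RPC per device.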
bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.390 13:35:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.652 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.652 13:35:44 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.912 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.173 
13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.173 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.433 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:30.694 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:07:30.694 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.694 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.694 13:35:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.694 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd6 /proc/partitions 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.954 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.215 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.476 13:35:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.738 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:31.999 13:35:46 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:31.999 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:32.260 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 
00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:32.521 13:35:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd14 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:32.782 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:33.043 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:33.043 13:35:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 
'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.304 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:33.565 /dev/nbd0 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.565 1+0 records in 00:07:33.565 1+0 records out 00:07:33.565 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264861 s, 15.5 MB/s 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.565 13:35:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:33.825 /dev/nbd1 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:33.825 1+0 records in 00:07:33.825 1+0 records out 00:07:33.825 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287052 s, 14.3 
MB/s 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:33.825 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:34.086 /dev/nbd10 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.086 1+0 records in 00:07:34.086 1+0 records out 00:07:34.086 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311276 s, 13.2 MB/s 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.086 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:34.348 /dev/nbd11 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 
00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.348 1+0 records in 00:07:34.348 1+0 records out 00:07:34.348 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000197683 s, 20.7 MB/s 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.348 13:35:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:34.610 /dev/nbd12 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:34.610 13:35:49 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.610 1+0 records in 00:07:34.610 1+0 records out 00:07:34.610 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341476 s, 12.0 MB/s 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.610 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:34.872 /dev/nbd13 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:34.872 1+0 records in 00:07:34.872 1+0 records out 00:07:34.872 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000347991 s, 11.8 MB/s 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 
'!=' 0 ']' 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:34.872 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:35.132 /dev/nbd14 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.132 1+0 records in 00:07:35.132 1+0 records out 00:07:35.132 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000386149 s, 10.6 MB/s 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.132 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:35.392 /dev/nbd15 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.392 1+0 records in 00:07:35.392 1+0 records out 00:07:35.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000422027 s, 9.7 MB/s 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.392 13:35:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:35.653 /dev/nbd2 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 
/proc/partitions 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.653 1+0 records in 00:07:35.653 1+0 records out 00:07:35.653 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000190965 s, 21.4 MB/s 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.653 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:35.915 /dev/nbd3 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:07:35.915 13:35:50 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.915 1+0 records in 00:07:35.915 1+0 records out 00:07:35.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000455512 s, 9.0 MB/s 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:35.915 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 
/dev/nbd4 00:07:36.176 /dev/nbd4 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.176 1+0 records in 00:07:36.176 1+0 records out 00:07:36.176 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000517612 s, 7.9 MB/s 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.176 
13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:36.176 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:36.437 /dev/nbd5 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.437 1+0 records in 00:07:36.437 1+0 records out 00:07:36.437 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000433288 s, 9.5 MB/s 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.437 13:35:50 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:36.437 13:35:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:36.698 /dev/nbd6 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.698 1+0 records in 00:07:36.698 1+0 records out 
00:07:36.698 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000512064 s, 8.0 MB/s 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:36.698 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:36.960 /dev/nbd7 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 
00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.960 1+0 records in 00:07:36.960 1+0 records out 00:07:36.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000519096 s, 7.9 MB/s 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:36.960 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:37.222 /dev/nbd8 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.222 1+0 records in 00:07:37.222 1+0 records out 00:07:37.222 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000473664 s, 8.6 MB/s 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:37.222 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:37.485 /dev/nbd9 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 
00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.485 1+0 records in 00:07:37.485 1+0 records out 00:07:37.485 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000488394 s, 8.4 MB/s 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.485 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:37.746 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd0", 00:07:37.746 "bdev_name": "Malloc0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd1", 00:07:37.746 "bdev_name": "Malloc1p0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd10", 00:07:37.746 "bdev_name": "Malloc1p1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd11", 00:07:37.746 "bdev_name": "Malloc2p0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd12", 00:07:37.746 "bdev_name": "Malloc2p1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd13", 00:07:37.746 "bdev_name": "Malloc2p2" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd14", 00:07:37.746 "bdev_name": "Malloc2p3" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd15", 00:07:37.746 "bdev_name": "Malloc2p4" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd2", 00:07:37.746 "bdev_name": "Malloc2p5" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd3", 00:07:37.746 "bdev_name": "Malloc2p6" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd4", 00:07:37.746 "bdev_name": "Malloc2p7" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd5", 00:07:37.746 "bdev_name": "TestPT" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd6", 00:07:37.746 "bdev_name": "raid0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd7", 00:07:37.746 "bdev_name": "concat0" 00:07:37.746 }, 00:07:37.746 { 
00:07:37.746 "nbd_device": "/dev/nbd8", 00:07:37.746 "bdev_name": "raid1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd9", 00:07:37.746 "bdev_name": "AIO0" 00:07:37.746 } 00:07:37.746 ]' 00:07:37.746 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd0", 00:07:37.746 "bdev_name": "Malloc0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd1", 00:07:37.746 "bdev_name": "Malloc1p0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd10", 00:07:37.746 "bdev_name": "Malloc1p1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd11", 00:07:37.746 "bdev_name": "Malloc2p0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd12", 00:07:37.746 "bdev_name": "Malloc2p1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd13", 00:07:37.746 "bdev_name": "Malloc2p2" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd14", 00:07:37.746 "bdev_name": "Malloc2p3" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd15", 00:07:37.746 "bdev_name": "Malloc2p4" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd2", 00:07:37.746 "bdev_name": "Malloc2p5" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd3", 00:07:37.746 "bdev_name": "Malloc2p6" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd4", 00:07:37.746 "bdev_name": "Malloc2p7" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd5", 00:07:37.746 "bdev_name": "TestPT" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd6", 00:07:37.746 "bdev_name": "raid0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd7", 00:07:37.746 "bdev_name": "concat0" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd8", 00:07:37.746 "bdev_name": "raid1" 00:07:37.746 }, 00:07:37.746 { 00:07:37.746 "nbd_device": "/dev/nbd9", 00:07:37.746 
"bdev_name": "AIO0" 00:07:37.746 } 00:07:37.746 ]' 00:07:37.746 13:35:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:37.746 /dev/nbd1 00:07:37.746 /dev/nbd10 00:07:37.746 /dev/nbd11 00:07:37.746 /dev/nbd12 00:07:37.746 /dev/nbd13 00:07:37.746 /dev/nbd14 00:07:37.746 /dev/nbd15 00:07:37.746 /dev/nbd2 00:07:37.746 /dev/nbd3 00:07:37.746 /dev/nbd4 00:07:37.746 /dev/nbd5 00:07:37.746 /dev/nbd6 00:07:37.746 /dev/nbd7 00:07:37.746 /dev/nbd8 00:07:37.746 /dev/nbd9' 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:37.746 /dev/nbd1 00:07:37.746 /dev/nbd10 00:07:37.746 /dev/nbd11 00:07:37.746 /dev/nbd12 00:07:37.746 /dev/nbd13 00:07:37.746 /dev/nbd14 00:07:37.746 /dev/nbd15 00:07:37.746 /dev/nbd2 00:07:37.746 /dev/nbd3 00:07:37.746 /dev/nbd4 00:07:37.746 /dev/nbd5 00:07:37.746 /dev/nbd6 00:07:37.746 /dev/nbd7 00:07:37.746 /dev/nbd8 00:07:37.746 /dev/nbd9' 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:37.746 256+0 records in 00:07:37.746 256+0 records out 00:07:37.746 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0124747 s, 84.1 MB/s 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:37.746 256+0 records in 00:07:37.746 256+0 records out 00:07:37.746 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0981344 s, 10.7 MB/s 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:37.746 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:38.006 256+0 records in 00:07:38.006 256+0 records out 00:07:38.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0855803 s, 12.3 MB/s 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:38.006 256+0 records in 00:07:38.006 256+0 records out 00:07:38.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0989171 s, 10.6 MB/s 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:38.006 256+0 records in 00:07:38.006 256+0 records out 00:07:38.006 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.096084 s, 10.9 MB/s 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.006 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:38.267 256+0 records in 00:07:38.267 256+0 records out 00:07:38.267 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0923546 s, 11.4 MB/s 00:07:38.267 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.267 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:38.267 256+0 records in 00:07:38.268 256+0 records out 00:07:38.268 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0970643 s, 10.8 MB/s 00:07:38.268 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.268 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:38.596 256+0 records in 00:07:38.596 256+0 records out 00:07:38.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.094843 s, 11.1 MB/s 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:38.596 256+0 records in 00:07:38.596 256+0 records out 00:07:38.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0933176 s, 11.2 MB/s 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:38.596 256+0 records in 00:07:38.596 256+0 records out 00:07:38.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.107384 s, 9.8 MB/s 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.596 13:35:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:38.889 256+0 records in 00:07:38.889 256+0 records out 00:07:38.889 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.103599 s, 10.1 MB/s 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:38.889 256+0 records in 00:07:38.889 256+0 records out 00:07:38.889 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0937967 s, 11.2 MB/s 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 
oflag=direct 00:07:38.889 256+0 records in 00:07:38.889 256+0 records out 00:07:38.889 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.108747 s, 9.6 MB/s 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:38.889 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:39.185 256+0 records in 00:07:39.185 256+0 records out 00:07:39.185 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0974013 s, 10.8 MB/s 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:39.185 256+0 records in 00:07:39.185 256+0 records out 00:07:39.185 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.095031 s, 11.0 MB/s 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:39.185 256+0 records in 00:07:39.185 256+0 records out 00:07:39.185 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.100419 s, 10.4 MB/s 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:39.185 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:39.447 256+0 records in 00:07:39.447 256+0 records out 00:07:39.447 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0929006 s, 11.3 MB/s 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd 
-- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.447 13:35:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.709 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:39.970 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.230 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:40.491 13:35:54 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:40.491 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.752 13:35:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.752 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.013 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.275 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.535 13:35:55 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.535 13:35:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.796 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 
00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.057 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.317 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 
00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.578 13:35:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.578 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.839 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:43.099 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:43.360 13:35:57 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:43.360 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:43.622 malloc_lvol_verify 00:07:43.622 13:35:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:43.882 1be6fef0-964f-4b39-985d-eed93f436ca3 00:07:43.882 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:44.142 6173e40d-46c5-4f28-b26a-7f3df404f74b 00:07:44.142 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:44.403 /dev/nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:44.403 mke2fs 1.46.5 (30-Dec-2021) 00:07:44.403 Discarding device blocks: 0/4096 done 00:07:44.403 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:44.403 00:07:44.403 Allocating group tables: 0/1 done 00:07:44.403 Writing inode tables: 0/1 done 00:07:44.403 Creating journal (1024 blocks): done 00:07:44.403 Writing superblocks and filesystem accounting information: 0/1 done 00:07:44.403 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:44.403 13:35:58 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:44.403 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1462001 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1462001 ']' 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1462001 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1462001 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1462001' 00:07:44.663 killing process with pid 1462001 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1462001 00:07:44.663 13:35:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1462001 00:07:44.663 13:35:59 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:07:44.663 00:07:44.663 real 0m20.271s 00:07:44.663 user 0m28.541s 00:07:44.663 sys 0m8.456s 00:07:44.663 13:35:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:44.663 13:35:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:44.663 ************************************ 00:07:44.663 END TEST bdev_nbd 00:07:44.663 ************************************ 00:07:44.924 13:35:59 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:07:44.924 13:35:59 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:07:44.924 13:35:59 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:07:44.924 13:35:59 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:07:44.924 13:35:59 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:07:44.924 13:35:59 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:44.924 13:35:59 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:44.924 ************************************ 00:07:44.924 START TEST bdev_fio 00:07:44.924 ************************************ 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:44.924 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:07:44.924 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 
00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo 
filename=Malloc2p7 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:44.925 13:35:59 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:44.925 ************************************ 00:07:44.925 START TEST bdev_fio_rw_verify 00:07:44.925 ************************************ 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1338 -- # local sanitizers 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:44.925 13:35:59 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:44.925 13:35:59 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:45.525 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, 
(T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:45.525 fio-3.35 00:07:45.525 Starting 16 threads 00:07:57.763 00:07:57.763 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1467036: Mon Jun 10 13:36:10 2024 00:07:57.763 read: IOPS=119k, BW=464MiB/s (486MB/s)(4639MiB/10001msec) 00:07:57.763 slat (usec): min=2, max=505, avg=25.26, stdev=14.12 00:07:57.763 clat (usec): min=6, max=879, avg=210.42, stdev=109.34 00:07:57.763 lat (usec): min=11, max=916, avg=235.68, stdev=114.27 00:07:57.763 clat percentiles (usec): 00:07:57.763 | 50.000th=[ 202], 99.000th=[ 502], 99.900th=[ 603], 99.990th=[ 709], 00:07:57.763 | 99.999th=[ 816] 00:07:57.763 write: IOPS=184k, BW=719MiB/s (754MB/s)(7099MiB/9869msec); 0 zone resets 00:07:57.763 slat (usec): min=3, max=2544, avg=38.98, stdev=15.46 00:07:57.763 clat (usec): min=7, max=3000, avg=262.83, stdev=127.95 00:07:57.763 lat (usec): min=24, max=3024, avg=301.81, stdev=133.37 00:07:57.763 clat percentiles (usec): 00:07:57.763 | 50.000th=[ 251], 99.000th=[ 578], 99.900th=[ 717], 99.990th=[ 857], 00:07:57.763 | 99.999th=[ 1045] 00:07:57.763 bw ( KiB/s): min=667785, max=802552, per=98.97%, avg=729014.26, stdev=2823.07, samples=304 00:07:57.763 iops : min=166945, max=200638, 
avg=182253.21, stdev=705.78, samples=304 00:07:57.763 lat (usec) : 10=0.01%, 20=0.12%, 50=2.66%, 100=10.14%, 250=43.02% 00:07:57.763 lat (usec) : 500=41.34%, 750=2.67%, 1000=0.03% 00:07:57.763 lat (msec) : 2=0.01%, 4=0.01% 00:07:57.763 cpu : usr=99.37%, sys=0.29%, ctx=671, majf=0, minf=1960 00:07:57.763 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:57.763 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:57.763 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:57.763 issued rwts: total=1187476,1817346,0,0 short=0,0,0,0 dropped=0,0,0,0 00:07:57.763 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:57.763 00:07:57.763 Run status group 0 (all jobs): 00:07:57.763 READ: bw=464MiB/s (486MB/s), 464MiB/s-464MiB/s (486MB/s-486MB/s), io=4639MiB (4864MB), run=10001-10001msec 00:07:57.763 WRITE: bw=719MiB/s (754MB/s), 719MiB/s-719MiB/s (754MB/s-754MB/s), io=7099MiB (7444MB), run=9869-9869msec 00:07:57.763 00:07:57.763 real 0m11.423s 00:07:57.763 user 2m58.577s 00:07:57.763 sys 0m1.024s 00:07:57.763 13:36:10 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:57.763 13:36:10 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:07:57.763 ************************************ 00:07:57.763 END TEST bdev_fio_rw_verify 00:07:57.763 ************************************ 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:07:57.763 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:57.764 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "605f33ca-2a30-4602-8efd-ef62ebee6504"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "605f33ca-2a30-4602-8efd-ef62ebee6504",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "aa879bea-8db6-5409-8436-ca3a28ad5ecb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "aa879bea-8db6-5409-8436-ca3a28ad5ecb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7e0d8304-904a-5c1b-b76b-1a7cd55ac899"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7e0d8304-904a-5c1b-b76b-1a7cd55ac899",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": 
{' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "04b8b342-71f1-5236-b876-729933fd5568"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "04b8b342-71f1-5236-b876-729933fd5568",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "101d4554-9b87-583f-9dd6-f2b02515ddbe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "101d4554-9b87-583f-9dd6-f2b02515ddbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8ebe4a8-6014-5838-ae63-8db4848f2dd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8ebe4a8-6014-5838-ae63-8db4848f2dd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "871f88f9-37d1-56b4-822e-bc9457501387"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "871f88f9-37d1-56b4-822e-bc9457501387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "008634fc-1d7d-5583-b43b-282521449004"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "008634fc-1d7d-5583-b43b-282521449004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "47b4e6f9-9632-5fca-bfb0-25b8cb44e762"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47b4e6f9-9632-5fca-bfb0-25b8cb44e762",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "bb20bb00-2537-502c-8b05-cb511c04f658"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb20bb00-2537-502c-8b05-cb511c04f658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "e1e337c0-bc1f-5bde-ba13-29d711804bf2"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e1e337c0-bc1f-5bde-ba13-29d711804bf2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "f60adfff-c170-491a-a1b2-16090cd578fb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' 
' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd0b6634-6f04-467c-8725-d81bd7278549",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d48d1ea7-91bc-4dd7-8831-a53ae4b6e493",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "1f947de7-adbe-400d-8ae5-b3cec593608d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 
2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "45d6aa97-ac74-477d-91e7-60ab84604d76",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "34d08a68-9f2e-4b15-b85b-210f4364e65a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "78e4a155-e00d-4eab-bad9-1597a8870850",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "fbdc7dcd-953d-4fd2-aa37-55fe0c6bd9bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' 
"88ba31ec-9450-4ca0-8ac3-5212781ba869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "88ba31ec-9450-4ca0-8ac3-5212781ba869",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:57.764 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:07:57.764 Malloc1p0 00:07:57.764 Malloc1p1 00:07:57.764 Malloc2p0 00:07:57.764 Malloc2p1 00:07:57.764 Malloc2p2 00:07:57.764 Malloc2p3 00:07:57.764 Malloc2p4 00:07:57.764 Malloc2p5 00:07:57.764 Malloc2p6 00:07:57.764 Malloc2p7 00:07:57.764 TestPT 00:07:57.764 raid0 00:07:57.764 concat0 ]] 00:07:57.764 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "605f33ca-2a30-4602-8efd-ef62ebee6504"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "605f33ca-2a30-4602-8efd-ef62ebee6504",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "aa879bea-8db6-5409-8436-ca3a28ad5ecb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "aa879bea-8db6-5409-8436-ca3a28ad5ecb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7e0d8304-904a-5c1b-b76b-1a7cd55ac899"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7e0d8304-904a-5c1b-b76b-1a7cd55ac899",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "04b8b342-71f1-5236-b876-729933fd5568"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' 
"uuid": "04b8b342-71f1-5236-b876-729933fd5568",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "101d4554-9b87-583f-9dd6-f2b02515ddbe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "101d4554-9b87-583f-9dd6-f2b02515ddbe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "a8ebe4a8-6014-5838-ae63-8db4848f2dd6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a8ebe4a8-6014-5838-ae63-8db4848f2dd6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": 
false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "871f88f9-37d1-56b4-822e-bc9457501387"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "871f88f9-37d1-56b4-822e-bc9457501387",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "008634fc-1d7d-5583-b43b-282521449004"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "008634fc-1d7d-5583-b43b-282521449004",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "83be7e2a-bcf2-5e3f-b91c-de9c94239cf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' 
},' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "47b4e6f9-9632-5fca-bfb0-25b8cb44e762"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "47b4e6f9-9632-5fca-bfb0-25b8cb44e762",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "bb20bb00-2537-502c-8b05-cb511c04f658"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb20bb00-2537-502c-8b05-cb511c04f658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"e1e337c0-bc1f-5bde-ba13-29d711804bf2"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e1e337c0-bc1f-5bde-ba13-29d711804bf2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "f60adfff-c170-491a-a1b2-16090cd578fb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "f60adfff-c170-491a-a1b2-16090cd578fb",' ' "strip_size_kb": 64,' ' "state": 
"online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "bd0b6634-6f04-467c-8725-d81bd7278549",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d48d1ea7-91bc-4dd7-8831-a53ae4b6e493",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "1f947de7-adbe-400d-8ae5-b3cec593608d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1f947de7-adbe-400d-8ae5-b3cec593608d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "45d6aa97-ac74-477d-91e7-60ab84604d76",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": 
"34d08a68-9f2e-4b15-b85b-210f4364e65a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "77cb98dd-2d30-4bcd-a8cb-d80b39b5df08",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "78e4a155-e00d-4eab-bad9-1597a8870850",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "fbdc7dcd-953d-4fd2-aa37-55fe0c6bd9bd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "88ba31ec-9450-4ca0-8ac3-5212781ba869"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "88ba31ec-9450-4ca0-8ac3-5212781ba869",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 
-- # echo '[job_Malloc2p0]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo 
filename=Malloc2p5 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:57.766 13:36:10 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:57.766 ************************************ 00:07:57.766 START TEST bdev_fio_trim 00:07:57.766 ************************************ 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:57.766 13:36:10 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:07:57.766 13:36:10 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:57.766 13:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:57.766 13:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:57.766 13:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:57.766 13:36:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.766 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:07:57.766 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.766 fio-3.35 00:07:57.766 Starting 14 threads 00:08:07.766 00:08:07.766 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1470075: Mon Jun 10 13:36:21 2024 00:08:07.766 write: IOPS=171k, BW=667MiB/s (700MB/s)(6672MiB/10001msec); 0 zone resets 00:08:07.766 slat (usec): min=2, max=2220, avg=28.59, stdev=13.35 00:08:07.766 clat (usec): min=11, max=2562, avg=213.08, stdev=83.79 00:08:07.766 lat (usec): min=22, max=2601, avg=241.68, stdev=87.02 00:08:07.766 clat percentiles (usec): 00:08:07.766 | 50.000th=[ 204], 99.000th=[ 437], 99.900th=[ 502], 99.990th=[ 562], 00:08:07.766 | 99.999th=[ 725] 00:08:07.767 bw ( KiB/s): min=596296, max=779378, per=100.00%, avg=686374.05, stdev=4527.34, samples=266 00:08:07.767 iops : min=149074, max=194842, avg=171593.26, stdev=1131.81, samples=266 00:08:07.767 trim: IOPS=171k, BW=667MiB/s (700MB/s)(6672MiB/10001msec); 0 zone resets 00:08:07.767 slat (usec): min=3, max=487, avg=18.63, stdev= 8.23 00:08:07.767 clat (usec): min=3, max=2601, avg=232.51, stdev=90.78 00:08:07.767 lat (usec): min=10, max=2611, avg=251.14, stdev=93.71 00:08:07.767 clat percentiles (usec): 00:08:07.767 | 50.000th=[ 227], 99.000th=[ 461], 99.900th=[ 529], 99.990th=[ 594], 00:08:07.767 | 99.999th=[ 734] 00:08:07.767 bw ( KiB/s): min=596296, max=779394, per=100.00%, avg=686374.05, stdev=4527.44, samples=266 00:08:07.767 iops : min=149074, max=194846, avg=171593.47, stdev=1131.83, samples=266 00:08:07.767 lat (usec) : 4=0.01%, 10=0.08%, 20=0.20%, 
50=0.92%, 100=5.17% 00:08:07.767 lat (usec) : 250=58.53%, 500=34.89%, 750=0.20%, 1000=0.01% 00:08:07.767 lat (msec) : 2=0.01%, 4=0.01% 00:08:07.767 cpu : usr=99.68%, sys=0.00%, ctx=635, majf=0, minf=988 00:08:07.767 IO depths : 1=12.3%, 2=24.6%, 4=50.1%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:07.767 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:07.767 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:07.767 issued rwts: total=0,1708074,1708078,0 short=0,0,0,0 dropped=0,0,0,0 00:08:07.767 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:07.767 00:08:07.767 Run status group 0 (all jobs): 00:08:07.767 WRITE: bw=667MiB/s (700MB/s), 667MiB/s-667MiB/s (700MB/s-700MB/s), io=6672MiB (6996MB), run=10001-10001msec 00:08:07.767 TRIM: bw=667MiB/s (700MB/s), 667MiB/s-667MiB/s (700MB/s-700MB/s), io=6672MiB (6996MB), run=10001-10001msec 00:08:07.767 00:08:07.767 real 0m11.266s 00:08:07.767 user 2m29.448s 00:08:07.767 sys 0m0.889s 00:08:07.767 13:36:22 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:07.767 13:36:22 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:08:07.767 ************************************ 00:08:07.767 END TEST bdev_fio_trim 00:08:07.767 ************************************ 00:08:07.767 13:36:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:08:07.767 13:36:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:08.028 13:36:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:08:08.028 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:08.028 13:36:22 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:08:08.028 00:08:08.028 real 0m23.053s 00:08:08.028 user 5m28.225s 00:08:08.028 sys 0m2.100s 00:08:08.028 13:36:22 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:08:08.028 13:36:22 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:08.028 ************************************ 00:08:08.028 END TEST bdev_fio 00:08:08.028 ************************************ 00:08:08.028 13:36:22 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:08.028 13:36:22 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:08.028 13:36:22 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:08:08.028 13:36:22 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:08.028 13:36:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:08.028 ************************************ 00:08:08.028 START TEST bdev_verify 00:08:08.028 ************************************ 00:08:08.028 13:36:22 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:08.028 [2024-06-10 13:36:22.385401] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:08.028 [2024-06-10 13:36:22.385444] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1472101 ] 00:08:08.028 [2024-06-10 13:36:22.473714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:08.289 [2024-06-10 13:36:22.541883] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.289 [2024-06-10 13:36:22.541889] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.289 [2024-06-10 13:36:22.665104] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:08.289 [2024-06-10 13:36:22.665141] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:08.289 [2024-06-10 13:36:22.665151] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:08.289 [2024-06-10 13:36:22.673115] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:08.289 [2024-06-10 13:36:22.673134] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:08.289 [2024-06-10 13:36:22.681127] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:08.289 [2024-06-10 13:36:22.681144] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:08.289 [2024-06-10 13:36:22.743313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:08.289 [2024-06-10 13:36:22.743348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:08.289 [2024-06-10 13:36:22.743358] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2333c60 00:08:08.289 [2024-06-10 13:36:22.743365] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:08.289 [2024-06-10 
13:36:22.744635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:08.289 [2024-06-10 13:36:22.744654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:08.550 Running I/O for 5 seconds... 00:08:13.841 00:08:13.842 Latency(us) 00:08:13.842 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:13.842 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x1000 00:08:13.842 Malloc0 : 5.08 1361.75 5.32 0.00 0.00 93823.82 464.21 368749.23 00:08:13.842 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x1000 length 0x1000 00:08:13.842 Malloc0 : 5.11 1353.95 5.29 0.00 0.00 94363.07 460.80 370496.85 00:08:13.842 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x800 00:08:13.842 Malloc1p0 : 5.08 705.84 2.76 0.00 0.00 180520.75 2184.53 212336.64 00:08:13.842 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x800 length 0x800 00:08:13.842 Malloc1p0 : 5.11 701.80 2.74 0.00 0.00 181549.28 2198.19 213210.45 00:08:13.842 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x800 00:08:13.842 Malloc1p1 : 5.08 705.58 2.76 0.00 0.00 180058.34 2252.80 207093.76 00:08:13.842 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x800 length 0x800 00:08:13.842 Malloc1p1 : 5.11 701.55 2.74 0.00 0.00 181087.60 2266.45 207967.57 00:08:13.842 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p0 : 5.21 712.95 2.78 0.00 0.00 177751.14 2293.76 203598.51 00:08:13.842 
Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p0 : 5.11 701.30 2.74 0.00 0.00 180663.11 2293.76 204472.32 00:08:13.842 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p1 : 5.21 712.70 2.78 0.00 0.00 177354.28 2498.56 200103.25 00:08:13.842 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p1 : 5.11 701.05 2.74 0.00 0.00 180259.46 2512.21 201850.88 00:08:13.842 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p2 : 5.21 712.45 2.78 0.00 0.00 176940.94 2785.28 193112.75 00:08:13.842 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p2 : 5.11 700.80 2.74 0.00 0.00 179814.59 2785.28 194860.37 00:08:13.842 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p3 : 5.21 712.20 2.78 0.00 0.00 176507.34 2307.41 189617.49 00:08:13.842 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p3 : 5.21 711.97 2.78 0.00 0.00 176547.50 2362.03 190491.31 00:08:13.842 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p4 : 5.21 711.93 2.78 0.00 0.00 176090.14 2430.29 183500.80 00:08:13.842 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 
Malloc2p4 : 5.22 711.29 2.78 0.00 0.00 176213.75 2430.29 185248.43 00:08:13.842 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p5 : 5.22 711.26 2.78 0.00 0.00 175749.36 3331.41 177384.11 00:08:13.842 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p5 : 5.22 710.53 2.78 0.00 0.00 175878.16 3345.07 179131.73 00:08:13.842 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p6 : 5.22 710.49 2.78 0.00 0.00 175425.36 3249.49 173888.85 00:08:13.842 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p6 : 5.23 709.80 2.77 0.00 0.00 175567.06 3235.84 174762.67 00:08:13.842 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x200 00:08:13.842 Malloc2p7 : 5.23 709.77 2.77 0.00 0.00 175133.39 2184.53 169519.79 00:08:13.842 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x200 length 0x200 00:08:13.842 Malloc2p7 : 5.23 709.17 2.77 0.00 0.00 175267.97 2211.84 170393.60 00:08:13.842 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x0 length 0x1000 00:08:13.842 TestPT : 5.26 705.33 2.76 0.00 0.00 175592.55 23483.73 168645.97 00:08:13.842 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.842 Verification LBA range: start 0x1000 length 0x1000 00:08:13.842 TestPT : 5.24 686.50 2.68 0.00 0.00 179421.91 24576.00 170393.60 00:08:13.842 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 
Verification LBA range: start 0x0 length 0x2000 00:08:13.843 raid0 : 5.24 708.89 2.77 0.00 0.00 173990.14 3249.49 141557.76 00:08:13.843 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x2000 length 0x2000 00:08:13.843 raid0 : 5.24 708.80 2.77 0.00 0.00 173994.74 3276.80 143305.39 00:08:13.843 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x0 length 0x2000 00:08:13.843 concat0 : 5.24 708.51 2.77 0.00 0.00 173629.57 2717.01 138936.32 00:08:13.843 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x2000 length 0x2000 00:08:13.843 concat0 : 5.24 708.42 2.77 0.00 0.00 173660.90 2730.67 140683.95 00:08:13.843 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x0 length 0x1000 00:08:13.843 raid1 : 5.27 728.70 2.85 0.00 0.00 168463.90 2785.28 138936.32 00:08:13.843 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x1000 length 0x1000 00:08:13.843 raid1 : 5.27 728.73 2.85 0.00 0.00 168456.49 3495.25 142431.57 00:08:13.843 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x0 length 0x4e2 00:08:13.843 AIO0 : 5.27 728.06 2.84 0.00 0.00 168204.76 1597.44 144179.20 00:08:13.843 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:13.843 Verification LBA range: start 0x4e2 length 0x4e2 00:08:13.843 AIO0 : 5.27 728.09 2.84 0.00 0.00 168174.76 1624.75 146800.64 00:08:13.843 =================================================================================================================== 00:08:13.843 Total : 24020.15 93.83 0.00 0.00 166790.67 460.80 370496.85 00:08:14.103 00:08:14.103 real 0m6.201s 00:08:14.103 user 0m11.739s 00:08:14.103 sys 0m0.252s 00:08:14.103 
13:36:28 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:14.103 13:36:28 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:08:14.103 ************************************ 00:08:14.103 END TEST bdev_verify 00:08:14.103 ************************************ 00:08:14.103 13:36:28 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:14.103 13:36:28 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:08:14.103 13:36:28 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:14.103 13:36:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:14.365 ************************************ 00:08:14.365 START TEST bdev_verify_big_io 00:08:14.365 ************************************ 00:08:14.365 13:36:28 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:14.365 [2024-06-10 13:36:28.660617] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:14.365 [2024-06-10 13:36:28.660660] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1473436 ] 00:08:14.365 [2024-06-10 13:36:28.748652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:14.365 [2024-06-10 13:36:28.816617] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.365 [2024-06-10 13:36:28.816623] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.626 [2024-06-10 13:36:28.936885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:14.626 [2024-06-10 13:36:28.936930] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:14.626 [2024-06-10 13:36:28.936939] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:14.626 [2024-06-10 13:36:28.944894] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:14.626 [2024-06-10 13:36:28.944913] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:14.626 [2024-06-10 13:36:28.952907] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:14.626 [2024-06-10 13:36:28.952923] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:14.626 [2024-06-10 13:36:29.015061] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:14.626 [2024-06-10 13:36:29.015097] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:14.626 [2024-06-10 13:36:29.015107] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240ec60 00:08:14.626 [2024-06-10 13:36:29.015114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:14.626 [2024-06-10 
13:36:29.016379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:14.626 [2024-06-10 13:36:29.016398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:14.887 [2024-06-10 13:36:29.157807] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.158443] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.159408] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.160123] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.161258] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.161998] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.163122] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.164275] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.165064] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.166221] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.166998] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.168132] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.168916] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:08:14.887 [2024-06-10 13:36:29.170083] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:14.888 [2024-06-10 13:36:29.170770] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:14.888 [2024-06-10 13:36:29.171731] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:14.888 [2024-06-10 13:36:29.187329] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:14.888 [2024-06-10 13:36:29.188590] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:14.888 Running I/O for 5 seconds... 
00:08:23.027 00:08:23.027 Latency(us) 00:08:23.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:23.027 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.027 Verification LBA range: start 0x0 length 0x100 00:08:23.028 Malloc0 : 5.77 177.43 11.09 0.00 0.00 706960.86 788.48 1915398.83 00:08:23.028 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x100 length 0x100 00:08:23.028 Malloc0 : 6.02 148.77 9.30 0.00 0.00 844476.17 778.24 2292886.19 00:08:23.028 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x80 00:08:23.028 Malloc1p0 : 6.61 36.31 2.27 0.00 0.00 3188941.21 1290.24 5368709.12 00:08:23.028 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x80 length 0x80 00:08:23.028 Malloc1p0 : 6.32 86.75 5.42 0.00 0.00 1358806.93 1966.08 2712316.59 00:08:23.028 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x80 00:08:23.028 Malloc1p1 : 6.61 36.30 2.27 0.00 0.00 3082813.17 1276.59 5172974.93 00:08:23.028 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x80 length 0x80 00:08:23.028 Malloc1p1 : 6.70 35.80 2.24 0.00 0.00 3161208.87 1283.41 5452595.20 00:08:23.028 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p0 : 6.20 25.80 1.61 0.00 0.00 1098794.98 518.83 2083170.99 00:08:23.028 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p0 : 6.20 23.21 1.45 0.00 0.00 1219470.12 535.89 1999284.91 00:08:23.028 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p1 : 6.20 25.79 1.61 0.00 0.00 1088894.15 515.41 2055208.96 00:08:23.028 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p1 : 6.21 23.20 1.45 0.00 0.00 1208829.24 522.24 1971322.88 00:08:23.028 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p2 : 6.21 25.78 1.61 0.00 0.00 1079315.77 518.83 2027246.93 00:08:23.028 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p2 : 6.21 23.19 1.45 0.00 0.00 1198881.36 522.24 1943360.85 00:08:23.028 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p3 : 6.21 25.77 1.61 0.00 0.00 1069591.61 532.48 2013265.92 00:08:23.028 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p3 : 6.21 23.17 1.45 0.00 0.00 1188204.67 525.65 1915398.83 00:08:23.028 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p4 : 6.21 25.75 1.61 0.00 0.00 1059664.54 508.59 1971322.88 00:08:23.028 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p4 : 6.22 23.16 1.45 0.00 0.00 1178035.16 518.83 1901417.81 00:08:23.028 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p5 : 6.22 25.74 1.61 0.00 0.00 1050851.62 
529.07 1943360.85 00:08:23.028 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p5 : 6.32 25.32 1.58 0.00 0.00 1080292.15 532.48 1873455.79 00:08:23.028 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p6 : 6.22 25.72 1.61 0.00 0.00 1041263.09 522.24 1915398.83 00:08:23.028 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p6 : 6.32 25.32 1.58 0.00 0.00 1070508.23 515.41 1845493.76 00:08:23.028 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x20 00:08:23.028 Malloc2p7 : 6.32 27.87 1.74 0.00 0.00 960567.69 508.59 1887436.80 00:08:23.028 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x20 length 0x20 00:08:23.028 Malloc2p7 : 6.32 25.31 1.58 0.00 0.00 1060878.52 512.00 1831512.75 00:08:23.028 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x100 00:08:23.028 TestPT : 6.77 40.19 2.51 0.00 0.00 2538500.73 1276.59 4781506.56 00:08:23.028 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x100 length 0x100 00:08:23.028 TestPT : 6.76 33.70 2.11 0.00 0.00 3051850.55 111848.11 3523215.36 00:08:23.028 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x200 00:08:23.028 raid0 : 6.81 42.28 2.64 0.00 0.00 2340643.59 1392.64 4613734.40 00:08:23.028 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x200 length 0x200 00:08:23.028 
raid0 : 6.65 40.90 2.56 0.00 0.00 2458043.62 1365.33 4781506.56 00:08:23.028 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x200 00:08:23.028 concat0 : 6.62 55.48 3.47 0.00 0.00 1771947.37 1378.99 4418000.21 00:08:23.028 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x200 length 0x200 00:08:23.028 concat0 : 6.81 44.61 2.79 0.00 0.00 2203611.70 1365.33 4585772.37 00:08:23.028 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x100 00:08:23.028 raid1 : 6.77 59.67 3.73 0.00 0.00 1592803.21 1761.28 4250228.05 00:08:23.028 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x100 length 0x100 00:08:23.028 raid1 : 6.77 49.65 3.10 0.00 0.00 1921967.91 1761.28 4418000.21 00:08:23.028 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x0 length 0x4e 00:08:23.028 AIO0 : 6.83 67.04 4.19 0.00 0.00 844603.41 638.29 2754259.63 00:08:23.028 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:23.028 Verification LBA range: start 0x4e length 0x4e 00:08:23.028 AIO0 : 6.81 72.68 4.54 0.00 0.00 784769.92 662.19 2908050.77 00:08:23.028 =================================================================================================================== 00:08:23.028 Total : 1427.69 89.23 0.00 0.00 1465413.96 508.59 5452595.20 00:08:23.028 00:08:23.028 real 0m7.770s 00:08:23.028 user 0m14.853s 00:08:23.028 sys 0m0.273s 00:08:23.028 13:36:36 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:23.028 13:36:36 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:23.028 ************************************ 00:08:23.028 END TEST 
bdev_verify_big_io 00:08:23.028 ************************************ 00:08:23.028 13:36:36 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:23.028 13:36:36 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:23.028 13:36:36 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:23.028 13:36:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:23.028 ************************************ 00:08:23.028 START TEST bdev_write_zeroes 00:08:23.028 ************************************ 00:08:23.028 13:36:36 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:23.028 [2024-06-10 13:36:36.523491] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:23.028 [2024-06-10 13:36:36.523545] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475032 ] 00:08:23.028 [2024-06-10 13:36:36.612865] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.028 [2024-06-10 13:36:36.688611] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.028 [2024-06-10 13:36:36.809464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:23.028 [2024-06-10 13:36:36.809510] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:23.028 [2024-06-10 13:36:36.809519] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:23.028 [2024-06-10 13:36:36.817474] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.028 [2024-06-10 13:36:36.817493] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:23.028 [2024-06-10 13:36:36.825485] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.028 [2024-06-10 13:36:36.825500] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:23.028 [2024-06-10 13:36:36.887439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:23.028 [2024-06-10 13:36:36.887477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:23.028 [2024-06-10 13:36:36.887486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2482d30 00:08:23.028 [2024-06-10 13:36:36.887498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:23.028 [2024-06-10 13:36:36.888823] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:23.028 
[2024-06-10 13:36:36.888843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:23.028 Running I/O for 1 seconds... 00:08:23.970 00:08:23.970 Latency(us) 00:08:23.970 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:23.970 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc0 : 1.03 5587.36 21.83 0.00 0.00 22892.18 563.20 38447.79 00:08:23.970 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc1p0 : 1.03 5579.90 21.80 0.00 0.00 22885.60 808.96 37792.43 00:08:23.970 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc1p1 : 1.03 5572.52 21.77 0.00 0.00 22868.54 832.85 36918.61 00:08:23.970 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p0 : 1.04 5565.09 21.74 0.00 0.00 22850.42 819.20 36044.80 00:08:23.970 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p1 : 1.04 5557.75 21.71 0.00 0.00 22833.85 843.09 35170.99 00:08:23.970 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p2 : 1.04 5550.41 21.68 0.00 0.00 22823.81 805.55 34515.63 00:08:23.970 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p3 : 1.04 5543.07 21.65 0.00 0.00 22812.07 805.55 33641.81 00:08:23.970 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p4 : 1.04 5535.79 21.62 0.00 0.00 22798.79 829.44 32768.00 00:08:23.970 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p5 : 1.04 5528.50 21.60 0.00 0.00 22785.69 805.55 32112.64 00:08:23.970 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p6 : 1.04 5521.22 21.57 0.00 0.00 22770.04 
805.55 31238.83 00:08:23.970 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 Malloc2p7 : 1.06 5576.82 21.78 0.00 0.00 22501.48 812.37 30583.47 00:08:23.970 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 TestPT : 1.06 5569.60 21.76 0.00 0.00 22484.77 856.75 29709.65 00:08:23.970 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.970 raid0 : 1.06 5561.32 21.72 0.00 0.00 22457.60 1536.00 28180.48 00:08:23.970 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.971 concat0 : 1.06 5553.23 21.69 0.00 0.00 22412.99 1522.35 26651.31 00:08:23.971 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.971 raid1 : 1.06 5543.08 21.65 0.00 0.00 22365.25 2389.33 24685.23 00:08:23.971 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:23.971 AIO0 : 1.06 5536.85 21.63 0.00 0.00 22293.11 832.85 24685.23 00:08:23.971 =================================================================================================================== 00:08:23.971 Total : 88882.51 347.20 0.00 0.00 22675.13 563.20 38447.79 00:08:23.971 00:08:23.971 real 0m1.907s 00:08:23.971 user 0m1.626s 00:08:23.971 sys 0m0.223s 00:08:23.971 13:36:38 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:23.971 13:36:38 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:08:23.971 ************************************ 00:08:23.971 END TEST bdev_write_zeroes 00:08:23.971 ************************************ 00:08:23.971 13:36:38 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:23.971 13:36:38 
blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:23.971 13:36:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:23.971 13:36:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.230 ************************************ 00:08:24.230 START TEST bdev_json_nonenclosed 00:08:24.230 ************************************ 00:08:24.230 13:36:38 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:24.230 [2024-06-10 13:36:38.498071] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:08:24.230 [2024-06-10 13:36:38.498116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475443 ] 00:08:24.230 [2024-06-10 13:36:38.586128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.230 [2024-06-10 13:36:38.657144] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.230 [2024-06-10 13:36:38.657203] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:08:24.230 [2024-06-10 13:36:38.657216] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:24.230 [2024-06-10 13:36:38.657223] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:24.491 00:08:24.491 real 0m0.271s 00:08:24.491 user 0m0.171s 00:08:24.491 sys 0m0.099s 00:08:24.491 13:36:38 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:24.491 13:36:38 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:24.491 ************************************ 00:08:24.491 END TEST bdev_json_nonenclosed 00:08:24.491 ************************************ 00:08:24.491 13:36:38 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:24.491 13:36:38 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:24.491 13:36:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:24.491 13:36:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.491 ************************************ 00:08:24.491 START TEST bdev_json_nonarray 00:08:24.491 ************************************ 00:08:24.491 13:36:38 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:24.491 [2024-06-10 13:36:38.851508] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:24.491 [2024-06-10 13:36:38.851553] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475473 ] 00:08:24.491 [2024-06-10 13:36:38.940205] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.752 [2024-06-10 13:36:39.007695] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.752 [2024-06-10 13:36:39.007752] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:08:24.752 [2024-06-10 13:36:39.007764] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:24.752 [2024-06-10 13:36:39.007771] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:24.752 00:08:24.752 real 0m0.270s 00:08:24.752 user 0m0.167s 00:08:24.752 sys 0m0.101s 00:08:24.752 13:36:39 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:24.752 13:36:39 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:24.752 ************************************ 00:08:24.752 END TEST bdev_json_nonarray 00:08:24.752 ************************************ 00:08:24.752 13:36:39 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:08:24.752 13:36:39 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:08:24.752 13:36:39 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:24.752 13:36:39 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:24.752 13:36:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:24.752 ************************************ 00:08:24.752 START TEST bdev_qos 00:08:24.752 ************************************ 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 
-- # qos_test_suite '' 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=1475494 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 1475494' 00:08:24.752 Process qos testing pid: 1475494 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 1475494 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@830 -- # '[' -z 1475494 ']' 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:24.752 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:24.752 13:36:39 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:24.752 [2024-06-10 13:36:39.199401] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:24.752 [2024-06-10 13:36:39.199449] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475494 ] 00:08:25.012 [2024-06-10 13:36:39.270259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.013 [2024-06-10 13:36:39.337967] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:08:25.584 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:25.584 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@863 -- # return 0 00:08:25.584 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:25.584 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.584 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.844 Malloc_0 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_0 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- 
common/autotest_common.sh@10 -- # set +x 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.844 [ 00:08:25.844 { 00:08:25.844 "name": "Malloc_0", 00:08:25.844 "aliases": [ 00:08:25.844 "b986cbad-768b-417b-b0ca-32d91c5fb4ea" 00:08:25.844 ], 00:08:25.844 "product_name": "Malloc disk", 00:08:25.844 "block_size": 512, 00:08:25.844 "num_blocks": 262144, 00:08:25.844 "uuid": "b986cbad-768b-417b-b0ca-32d91c5fb4ea", 00:08:25.844 "assigned_rate_limits": { 00:08:25.844 "rw_ios_per_sec": 0, 00:08:25.844 "rw_mbytes_per_sec": 0, 00:08:25.844 "r_mbytes_per_sec": 0, 00:08:25.844 "w_mbytes_per_sec": 0 00:08:25.844 }, 00:08:25.844 "claimed": false, 00:08:25.844 "zoned": false, 00:08:25.844 "supported_io_types": { 00:08:25.844 "read": true, 00:08:25.844 "write": true, 00:08:25.844 "unmap": true, 00:08:25.844 "write_zeroes": true, 00:08:25.844 "flush": true, 00:08:25.844 "reset": true, 00:08:25.844 "compare": false, 00:08:25.844 "compare_and_write": false, 00:08:25.844 "abort": true, 00:08:25.844 "nvme_admin": false, 00:08:25.844 "nvme_io": false 00:08:25.844 }, 00:08:25.844 "memory_domains": [ 00:08:25.844 { 00:08:25.844 "dma_device_id": "system", 00:08:25.844 "dma_device_type": 1 00:08:25.844 }, 00:08:25.844 { 00:08:25.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:25.844 "dma_device_type": 2 00:08:25.844 } 00:08:25.844 ], 00:08:25.844 "driver_specific": {} 00:08:25.844 } 00:08:25.844 ] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:08:25.844 13:36:40 
blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.844 Null_1 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Null_1 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:25.844 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.844 [ 00:08:25.844 { 00:08:25.844 "name": "Null_1", 00:08:25.844 "aliases": [ 00:08:25.844 "8cf9b860-df88-4fbe-b6c4-1cdbdfeb6ea0" 00:08:25.844 ], 00:08:25.844 "product_name": "Null disk", 00:08:25.844 "block_size": 512, 00:08:25.844 "num_blocks": 
262144, 00:08:25.844 "uuid": "8cf9b860-df88-4fbe-b6c4-1cdbdfeb6ea0", 00:08:25.844 "assigned_rate_limits": { 00:08:25.845 "rw_ios_per_sec": 0, 00:08:25.845 "rw_mbytes_per_sec": 0, 00:08:25.845 "r_mbytes_per_sec": 0, 00:08:25.845 "w_mbytes_per_sec": 0 00:08:25.845 }, 00:08:25.845 "claimed": false, 00:08:25.845 "zoned": false, 00:08:25.845 "supported_io_types": { 00:08:25.845 "read": true, 00:08:25.845 "write": true, 00:08:25.845 "unmap": false, 00:08:25.845 "write_zeroes": true, 00:08:25.845 "flush": false, 00:08:25.845 "reset": true, 00:08:25.845 "compare": false, 00:08:25.845 "compare_and_write": false, 00:08:25.845 "abort": true, 00:08:25.845 "nvme_admin": false, 00:08:25.845 "nvme_io": false 00:08:25.845 }, 00:08:25.845 "driver_specific": {} 00:08:25.845 } 00:08:25.845 ] 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:25.845 13:36:40 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:25.845 Running I/O for 60 seconds... 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 70087.39 280349.55 0.00 0.00 282624.00 0.00 0.00 ' 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=70087.39 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 70087 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=70087 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=17000 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 17000 -gt 1000 ']' 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 17000 Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 17000 IOPS Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 
']' 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:31.124 13:36:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:31.124 ************************************ 00:08:31.124 START TEST bdev_qos_iops 00:08:31.124 ************************************ 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # run_qos_test 17000 IOPS Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=17000 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:31.124 13:36:45 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 17003.39 68013.55 0.00 0.00 68748.00 0.00 0.00 ' 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # 
iostat_result=17003.39 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 17003 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=17003 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=15300 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=18700 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 17003 -lt 15300 ']' 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 17003 -gt 18700 ']' 00:08:36.481 00:08:36.481 real 0m5.200s 00:08:36.481 user 0m0.108s 00:08:36.481 sys 0m0.036s 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:36.481 13:36:50 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:36.481 ************************************ 00:08:36.481 END TEST bdev_qos_iops 00:08:36.481 ************************************ 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:36.481 13:36:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:41.765 
13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 22539.70 90158.81 0.00 0.00 91136.00 0.00 0.00 ' 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=91136.00 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 91136 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=91136 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:08:41.765 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:41.766 13:36:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:41.766 ************************************ 00:08:41.766 START TEST bdev_qos_bw 00:08:41.766 ************************************ 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # run_qos_test 
8 BANDWIDTH Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:41.766 13:36:55 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.68 8190.70 0.00 0.00 8412.00 0.00 0.00 ' 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8412.00 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8412 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8412 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:47.050 
13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8412 -lt 7372 ']' 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8412 -gt 9011 ']' 00:08:47.050 00:08:47.050 real 0m5.273s 00:08:47.050 user 0m0.107s 00:08:47.050 sys 0m0.038s 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:08:47.050 ************************************ 00:08:47.050 END TEST bdev_qos_bw 00:08:47.050 ************************************ 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:47.050 13:37:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:47.050 ************************************ 00:08:47.050 START TEST bdev_qos_ro_bw 00:08:47.050 ************************************ 00:08:47.050 13:37:01 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:47.050 13:37:01 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.97 2047.87 0.00 0.00 2060.00 0.00 0.00 ' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # 
qos_result=2060 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:08:52.331 00:08:52.331 real 0m5.172s 00:08:52.331 user 0m0.102s 00:08:52.331 sys 0m0.040s 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:52.331 13:37:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:08:52.331 ************************************ 00:08:52.331 END TEST bdev_qos_ro_bw 00:08:52.331 ************************************ 00:08:52.331 13:37:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:08:52.331 13:37:06 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:52.331 13:37:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:52.590 13:37:06 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:52.590 13:37:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:08:52.590 13:37:06 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:52.590 13:37:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:52.851 00:08:52.851 Latency(us) 00:08:52.851 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:52.851 Job: 
Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:52.851 Malloc_0 : 26.66 23524.93 91.89 0.00 0.00 10781.16 1870.51 503316.48 00:08:52.851 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:52.851 Null_1 : 26.80 22909.75 89.49 0.00 0.00 11145.59 744.11 145053.01 00:08:52.851 =================================================================================================================== 00:08:52.851 Total : 46434.68 181.39 0.00 0.00 10961.45 744.11 503316.48 00:08:52.851 0 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 1475494 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@949 -- # '[' -z 1475494 ']' 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # kill -0 1475494 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # uname 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1475494 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1475494' 00:08:52.851 killing process with pid 1475494 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # kill 1475494 00:08:52.851 Received shutdown signal, test time was about 26.860739 seconds 00:08:52.851 00:08:52.851 Latency(us) 00:08:52.851 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:52.851 
=================================================================================================================== 00:08:52.851 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:52.851 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@973 -- # wait 1475494 00:08:52.852 13:37:07 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:08:52.852 00:08:52.852 real 0m28.134s 00:08:52.852 user 0m28.910s 00:08:52.852 sys 0m0.635s 00:08:52.852 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:52.852 13:37:07 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:52.852 ************************************ 00:08:52.852 END TEST bdev_qos 00:08:52.852 ************************************ 00:08:52.852 13:37:07 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:08:52.852 13:37:07 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:52.852 13:37:07 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:52.852 13:37:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:53.113 ************************************ 00:08:53.113 START TEST bdev_qd_sampling 00:08:53.113 ************************************ 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # qd_sampling_test_suite '' 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=1481197 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 1481197' 00:08:53.113 Process bdev QD sampling period testing pid: 1481197 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT 
SIGTERM EXIT 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 1481197 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@830 -- # '[' -z 1481197 ']' 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:53.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:53.113 13:37:07 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:53.113 [2024-06-10 13:37:07.418885] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:08:53.113 [2024-06-10 13:37:07.418941] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1481197 ] 00:08:53.113 [2024-06-10 13:37:07.512748] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:53.373 [2024-06-10 13:37:07.611441] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:08:53.373 [2024-06-10 13:37:07.611542] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@863 -- # return 0 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:53.944 Malloc_QD 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_QD 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local i 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:53.944 [ 00:08:53.944 { 00:08:53.944 "name": "Malloc_QD", 00:08:53.944 "aliases": [ 00:08:53.944 "d18bcc1c-a9ec-4ce1-8ef7-ada4764999f2" 00:08:53.944 ], 00:08:53.944 "product_name": "Malloc disk", 00:08:53.944 "block_size": 512, 00:08:53.944 "num_blocks": 262144, 00:08:53.944 "uuid": "d18bcc1c-a9ec-4ce1-8ef7-ada4764999f2", 00:08:53.944 "assigned_rate_limits": { 00:08:53.944 "rw_ios_per_sec": 0, 00:08:53.944 "rw_mbytes_per_sec": 0, 00:08:53.944 "r_mbytes_per_sec": 0, 00:08:53.944 "w_mbytes_per_sec": 0 00:08:53.944 }, 00:08:53.944 "claimed": false, 00:08:53.944 "zoned": false, 00:08:53.944 "supported_io_types": { 00:08:53.944 "read": true, 00:08:53.944 "write": true, 00:08:53.944 "unmap": true, 00:08:53.944 "write_zeroes": true, 00:08:53.944 "flush": true, 00:08:53.944 "reset": true, 00:08:53.944 "compare": false, 00:08:53.944 "compare_and_write": false, 00:08:53.944 "abort": true, 00:08:53.944 "nvme_admin": false, 00:08:53.944 "nvme_io": false 00:08:53.944 }, 00:08:53.944 "memory_domains": [ 00:08:53.944 { 00:08:53.944 "dma_device_id": "system", 00:08:53.944 "dma_device_type": 1 00:08:53.944 }, 00:08:53.944 { 00:08:53.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.944 "dma_device_type": 2 00:08:53.944 } 00:08:53.944 ], 00:08:53.944 
"driver_specific": {} 00:08:53.944 } 00:08:53.944 ] 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # return 0 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:08:53.944 13:37:08 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:53.944 Running I/O for 5 seconds... 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # 
iostats='{ 00:08:55.854 "tick_rate": 2400000000, 00:08:55.854 "ticks": 4387893560097647, 00:08:55.854 "bdevs": [ 00:08:55.854 { 00:08:55.854 "name": "Malloc_QD", 00:08:55.854 "bytes_read": 664842752, 00:08:55.854 "num_read_ops": 162308, 00:08:55.854 "bytes_written": 0, 00:08:55.854 "num_write_ops": 0, 00:08:55.854 "bytes_unmapped": 0, 00:08:55.854 "num_unmap_ops": 0, 00:08:55.854 "bytes_copied": 0, 00:08:55.854 "num_copy_ops": 0, 00:08:55.854 "read_latency_ticks": 2358962183138, 00:08:55.854 "max_read_latency_ticks": 17960406, 00:08:55.854 "min_read_latency_ticks": 229016, 00:08:55.854 "write_latency_ticks": 0, 00:08:55.854 "max_write_latency_ticks": 0, 00:08:55.854 "min_write_latency_ticks": 0, 00:08:55.854 "unmap_latency_ticks": 0, 00:08:55.854 "max_unmap_latency_ticks": 0, 00:08:55.854 "min_unmap_latency_ticks": 0, 00:08:55.854 "copy_latency_ticks": 0, 00:08:55.854 "max_copy_latency_ticks": 0, 00:08:55.854 "min_copy_latency_ticks": 0, 00:08:55.854 "io_error": {}, 00:08:55.854 "queue_depth_polling_period": 10, 00:08:55.854 "queue_depth": 512, 00:08:55.854 "io_time": 20, 00:08:55.854 "weighted_io_time": 10240 00:08:55.854 } 00:08:55.854 ] 00:08:55.854 }' 00:08:55.854 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:56.114 00:08:56.114 Latency(us) 00:08:56.114 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:56.114 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:08:56.114 Malloc_QD : 2.00 42411.15 165.67 0.00 0.00 6021.05 942.08 7536.64 00:08:56.114 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:56.114 Malloc_QD : 2.00 42650.59 166.60 0.00 0.00 5987.33 832.85 7482.03 00:08:56.114 =================================================================================================================== 00:08:56.114 Total : 85061.75 332.27 0.00 0.00 6004.14 832.85 7536.64 00:08:56.114 0 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 1481197 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@949 -- # '[' -z 1481197 ']' 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # kill -0 1481197 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # uname 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1481197 00:08:56.114 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1481197' 00:08:56.115 killing process with pid 1481197 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # kill 1481197 00:08:56.115 Received shutdown signal, test time was about 2.070444 
seconds 00:08:56.115 00:08:56.115 Latency(us) 00:08:56.115 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:56.115 =================================================================================================================== 00:08:56.115 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@973 -- # wait 1481197 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:08:56.115 00:08:56.115 real 0m3.222s 00:08:56.115 user 0m6.373s 00:08:56.115 sys 0m0.324s 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:56.115 13:37:10 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:56.115 ************************************ 00:08:56.115 END TEST bdev_qd_sampling 00:08:56.115 ************************************ 00:08:56.375 13:37:10 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:08:56.375 13:37:10 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:56.375 13:37:10 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:56.375 13:37:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:56.375 ************************************ 00:08:56.375 START TEST bdev_error 00:08:56.375 ************************************ 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # error_test_suite '' 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=1481885 00:08:56.375 13:37:10 
blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 1481885' 00:08:56.375 Process error testing pid: 1481885 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 1481885 00:08:56.375 13:37:10 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 1481885 ']' 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:56.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:56.375 13:37:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:56.375 [2024-06-10 13:37:10.709586] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
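The trace above shows bdevperf launched with `-z` and the suite blocking on "Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...". A minimal sketch of that wait, assuming a simple poll on the socket node — the helper name, retry budget, and interval are illustrative, not SPDK's actual `waitforlisten` implementation:

```shell
#!/usr/bin/env bash
# Hypothetical sketch (not the autotest source): poll for the bdevperf
# RPC Unix socket after starting it in -z (wait-for-RPC) mode.
# Default path, retry count, and sleep interval are assumptions.
wait_for_rpc_socket() {
    local rpc_sock="${1:-/var/tmp/spdk.sock}"
    local max_retries=100 i
    for ((i = 0; i < max_retries; i++)); do
        if [ -S "$rpc_sock" ]; then
            # the node exists and is a socket; the RPC server is listening
            echo "RPC socket ready: $rpc_sock"
            return 0
        fi
        sleep 0.1
    done
    echo "timed out waiting for $rpc_sock" >&2
    return 1
}
```

In the log this gate is what lets the suite proceed to the `rpc_cmd bdev_malloc_create` / `bdev_wait_for_examine` calls that follow.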
00:08:56.375 [2024-06-10 13:37:10.709642] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1481885 ] 00:08:56.375 [2024-06-10 13:37:10.784819] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.636 [2024-06-10 13:37:10.856662] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 Dev_1 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 [ 00:08:57.208 { 00:08:57.208 "name": "Dev_1", 00:08:57.208 "aliases": [ 00:08:57.208 "8da0cd6f-fd79-43f2-8539-94fd6a2236c9" 00:08:57.208 ], 00:08:57.208 "product_name": "Malloc disk", 00:08:57.208 "block_size": 512, 00:08:57.208 "num_blocks": 262144, 00:08:57.208 "uuid": "8da0cd6f-fd79-43f2-8539-94fd6a2236c9", 00:08:57.208 "assigned_rate_limits": { 00:08:57.208 "rw_ios_per_sec": 0, 00:08:57.208 "rw_mbytes_per_sec": 0, 00:08:57.208 "r_mbytes_per_sec": 0, 00:08:57.208 "w_mbytes_per_sec": 0 00:08:57.208 }, 00:08:57.208 "claimed": false, 00:08:57.208 "zoned": false, 00:08:57.208 "supported_io_types": { 00:08:57.208 "read": true, 00:08:57.208 "write": true, 00:08:57.208 "unmap": true, 00:08:57.208 "write_zeroes": true, 00:08:57.208 "flush": true, 00:08:57.208 "reset": true, 00:08:57.208 "compare": false, 00:08:57.208 "compare_and_write": false, 00:08:57.208 "abort": true, 00:08:57.208 "nvme_admin": false, 00:08:57.208 "nvme_io": false 00:08:57.208 }, 00:08:57.208 "memory_domains": [ 00:08:57.208 { 00:08:57.208 "dma_device_id": "system", 00:08:57.208 "dma_device_type": 1 00:08:57.208 }, 00:08:57.208 { 00:08:57.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:57.208 "dma_device_type": 2 00:08:57.208 } 00:08:57.208 ], 00:08:57.208 "driver_specific": {} 00:08:57.208 } 00:08:57.208 ] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 true 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 Dev_2 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.208 [ 00:08:57.208 { 00:08:57.208 "name": "Dev_2", 00:08:57.208 "aliases": [ 00:08:57.208 "02a57203-b831-4bdf-a13f-bb7c3be18ab0" 00:08:57.208 ], 00:08:57.208 "product_name": "Malloc disk", 00:08:57.208 "block_size": 512, 00:08:57.208 "num_blocks": 262144, 00:08:57.208 "uuid": "02a57203-b831-4bdf-a13f-bb7c3be18ab0", 00:08:57.208 "assigned_rate_limits": { 00:08:57.208 "rw_ios_per_sec": 0, 00:08:57.208 "rw_mbytes_per_sec": 0, 00:08:57.208 "r_mbytes_per_sec": 0, 00:08:57.208 "w_mbytes_per_sec": 0 00:08:57.208 }, 00:08:57.208 "claimed": false, 00:08:57.208 "zoned": false, 00:08:57.208 "supported_io_types": { 00:08:57.208 "read": true, 00:08:57.208 "write": true, 00:08:57.208 "unmap": true, 00:08:57.208 "write_zeroes": true, 00:08:57.208 "flush": true, 00:08:57.208 "reset": true, 00:08:57.208 "compare": false, 00:08:57.208 "compare_and_write": false, 00:08:57.208 "abort": true, 00:08:57.208 "nvme_admin": false, 00:08:57.208 "nvme_io": false 00:08:57.208 }, 00:08:57.208 "memory_domains": [ 00:08:57.208 { 00:08:57.208 "dma_device_id": "system", 00:08:57.208 "dma_device_type": 1 00:08:57.208 }, 00:08:57.208 { 00:08:57.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:57.208 "dma_device_type": 2 00:08:57.208 } 00:08:57.208 ], 00:08:57.208 "driver_specific": {} 00:08:57.208 } 00:08:57.208 ] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:08:57.208 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:57.208 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:57.208 
13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:57.469 13:37:11 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:57.469 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:08:57.469 13:37:11 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:57.469 Running I/O for 5 seconds... 00:08:58.409 13:37:12 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 1481885 00:08:58.409 13:37:12 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 1481885' 00:08:58.409 Process is existed as continue on error is set. Pid: 1481885 00:08:58.409 13:37:12 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:58.409 13:37:12 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:58.409 13:37:12 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:58.409 13:37:12 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:08:58.409 Timeout while waiting for response: 00:08:58.409 00:08:58.409 00:09:02.606 00:09:02.606 Latency(us) 00:09:02.606 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:02.606 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 
4096) 00:09:02.606 EE_Dev_1 : 0.91 41687.84 162.84 5.50 0.00 380.56 121.17 604.16 00:09:02.606 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:02.606 Dev_2 : 5.00 91071.64 355.75 0.00 0.00 172.54 60.16 10868.05 00:09:02.606 =================================================================================================================== 00:09:02.606 Total : 132759.48 518.59 5.50 0.00 188.52 60.16 10868.05 00:09:03.546 13:37:17 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 1481885 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@949 -- # '[' -z 1481885 ']' 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # kill -0 1481885 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # uname 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1481885 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1481885' 00:09:03.546 killing process with pid 1481885 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # kill 1481885 00:09:03.546 Received shutdown signal, test time was about 5.000000 seconds 00:09:03.546 00:09:03.546 Latency(us) 00:09:03.546 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:03.546 =================================================================================================================== 00:09:03.546 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:03.546 13:37:17 
blockdev_general.bdev_error -- common/autotest_common.sh@973 -- # wait 1481885 00:09:03.546 13:37:17 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=1483217 00:09:03.546 13:37:17 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 1483217' 00:09:03.546 Process error testing pid: 1483217 00:09:03.546 13:37:17 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:03.546 13:37:17 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 1483217 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 1483217 ']' 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:03.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:03.546 13:37:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:03.546 [2024-06-10 13:37:17.984065] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
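Between the two bdevperf runs the harness tears the first one down with the killprocess sequence visible above: a `uname` check, `ps --no-headers -o comm=` to read the process name, a guard against a `sudo` wrapper, then `kill` and `wait`. A hedged reconstruction of that shape — the `sudo` guard mirrors the `'[' reactor_1 = sudo ']'` test in the log; everything else is an assumption, not the real autotest_common.sh code:

```shell
# Hypothetical killprocess-style helper modeled on the log output above;
# not the canonical autotest_common.sh implementation.
killprocess() {
    local pid="$1"
    [ -n "$pid" ] || return 1
    local process_name
    # same probe as in the trace: resolve the pid to a command name
    process_name=$(ps --no-headers -o comm= "$pid") || return 1
    if [ "$process_name" = "sudo" ]; then
        # the trace branches on this; a sudo wrapper needs different handling
        return 1
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    # reap the child so the pid cannot be reused mid-test
    wait "$pid" 2>/dev/null || true
}
```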
00:09:03.546 [2024-06-10 13:37:17.984118] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483217 ] 00:09:03.806 [2024-06-10 13:37:18.054157] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.806 [2024-06-10 13:37:18.119464] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.377 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:04.377 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:09:04.377 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:04.377 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.377 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 Dev_1 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 [ 00:09:04.638 { 00:09:04.638 "name": "Dev_1", 00:09:04.638 "aliases": [ 00:09:04.638 "ed33d2e6-8bb0-4ae8-a9f6-242f44b844ee" 00:09:04.638 ], 00:09:04.638 "product_name": "Malloc disk", 00:09:04.638 "block_size": 512, 00:09:04.638 "num_blocks": 262144, 00:09:04.638 "uuid": "ed33d2e6-8bb0-4ae8-a9f6-242f44b844ee", 00:09:04.638 "assigned_rate_limits": { 00:09:04.638 "rw_ios_per_sec": 0, 00:09:04.638 "rw_mbytes_per_sec": 0, 00:09:04.638 "r_mbytes_per_sec": 0, 00:09:04.638 "w_mbytes_per_sec": 0 00:09:04.638 }, 00:09:04.638 "claimed": false, 00:09:04.638 "zoned": false, 00:09:04.638 "supported_io_types": { 00:09:04.638 "read": true, 00:09:04.638 "write": true, 00:09:04.638 "unmap": true, 00:09:04.638 "write_zeroes": true, 00:09:04.638 "flush": true, 00:09:04.638 "reset": true, 00:09:04.638 "compare": false, 00:09:04.638 "compare_and_write": false, 00:09:04.638 "abort": true, 00:09:04.638 "nvme_admin": false, 00:09:04.638 "nvme_io": false 00:09:04.638 }, 00:09:04.638 "memory_domains": [ 00:09:04.638 { 00:09:04.638 "dma_device_id": "system", 00:09:04.638 "dma_device_type": 1 00:09:04.638 }, 00:09:04.638 { 00:09:04.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:04.638 "dma_device_type": 2 00:09:04.638 } 00:09:04.638 ], 00:09:04.638 "driver_specific": {} 00:09:04.638 } 00:09:04.638 ] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:09:04.638 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 true 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 Dev_2 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.638 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.638 [ 00:09:04.638 { 00:09:04.638 "name": "Dev_2", 00:09:04.638 "aliases": [ 00:09:04.638 "30cda72f-9155-4085-a970-3008c862a3aa" 00:09:04.638 ], 00:09:04.638 "product_name": "Malloc disk", 00:09:04.638 "block_size": 512, 00:09:04.638 "num_blocks": 262144, 00:09:04.638 "uuid": "30cda72f-9155-4085-a970-3008c862a3aa", 00:09:04.638 "assigned_rate_limits": { 00:09:04.638 "rw_ios_per_sec": 0, 00:09:04.638 "rw_mbytes_per_sec": 0, 00:09:04.638 "r_mbytes_per_sec": 0, 00:09:04.638 "w_mbytes_per_sec": 0 00:09:04.638 }, 00:09:04.638 "claimed": false, 00:09:04.638 "zoned": false, 00:09:04.638 "supported_io_types": { 00:09:04.638 "read": true, 00:09:04.639 "write": true, 00:09:04.639 "unmap": true, 00:09:04.639 "write_zeroes": true, 00:09:04.639 "flush": true, 00:09:04.639 "reset": true, 00:09:04.639 "compare": false, 00:09:04.639 "compare_and_write": false, 00:09:04.639 "abort": true, 00:09:04.639 "nvme_admin": false, 00:09:04.639 "nvme_io": false 00:09:04.639 }, 00:09:04.639 "memory_domains": [ 00:09:04.639 { 00:09:04.639 "dma_device_id": "system", 00:09:04.639 "dma_device_type": 1 00:09:04.639 }, 00:09:04.639 { 00:09:04.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:04.639 "dma_device_type": 2 00:09:04.639 } 00:09:04.639 ], 00:09:04.639 "driver_specific": {} 00:09:04.639 } 00:09:04.639 ] 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:09:04.639 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:04.639 
13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:04.639 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 1483217 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@649 -- # local es=0 00:09:04.639 13:37:18 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # valid_exec_arg wait 1483217 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@637 -- # local arg=wait 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # type -t wait 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:04.639 13:37:18 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # wait 1483217 00:09:04.639 Running I/O for 5 seconds... 
00:09:04.639 task offset: 204528 on job bdev=EE_Dev_1 fails 00:09:04.639 00:09:04.639 Latency(us) 00:09:04.639 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:04.639 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:04.639 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:04.639 EE_Dev_1 : 0.00 33536.59 131.00 7621.95 0.00 322.83 120.32 573.44 00:09:04.639 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:04.639 Dev_2 : 0.00 20189.27 78.86 0.00 0.00 595.31 113.49 1112.75 00:09:04.639 =================================================================================================================== 00:09:04.639 Total : 53725.86 209.87 7621.95 0.00 470.61 113.49 1112.75 00:09:04.639 [2024-06-10 13:37:19.051502] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:04.639 request: 00:09:04.639 { 00:09:04.639 "method": "perform_tests", 00:09:04.639 "req_id": 1 00:09:04.639 } 00:09:04.639 Got JSON-RPC error response 00:09:04.639 response: 00:09:04.639 { 00:09:04.639 "code": -32603, 00:09:04.639 "message": "bdevperf failed with error Operation not permitted" 00:09:04.639 } 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # es=255 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # es=127 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # case "$es" in 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@669 -- # es=1 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:09:04.900 00:09:04.900 real 0m8.560s 00:09:04.900 user 0m9.093s 00:09:04.900 sys 0m0.550s 00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # xtrace_disable 
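The `es=255` → `(( es > 128 ))` → `es=127` → `es=1` → `(( !es == 0 ))` chain above is the NOT helper normalizing the deliberately failing `perform_tests` exit status (the JSON-RPC -32603 from the injected EE_Dev_1 errors). A sketch of that normalization, reconstructed from the trace — names mirror the log, but this is not the canonical autotest_common.sh source:

```shell
# Hypothetical NOT-style wrapper reconstructed from the es=255/127/1 trace
# above: it succeeds only when the wrapped command fails.
NOT() {
    local es=0
    "$@" || es=$?
    # statuses above 128 mean death by signal; fold them to 127
    (( es > 128 )) && es=127
    case "$es" in
        0) es=0 ;;
        *) es=1 ;;   # any flavor of failure collapses to 1
    esac
    # arithmetic truth: exits 0 (success) iff es != 0, i.e. the command failed
    (( !es == 0 ))
}
```

This is why `NOT wait 1483217` passes in the log even though bdevperf exits non-zero: the wrapper inverts the expectation.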
00:09:04.900 13:37:19 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:04.900 ************************************ 00:09:04.900 END TEST bdev_error 00:09:04.900 ************************************ 00:09:04.900 13:37:19 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:09:04.900 13:37:19 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:04.900 13:37:19 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:04.900 13:37:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.900 ************************************ 00:09:04.900 START TEST bdev_stat 00:09:04.900 ************************************ 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # stat_test_suite '' 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=1483564 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 1483564' 00:09:04.900 Process Bdev IO statistics testing pid: 1483564 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 1483564 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@830 -- # '[' -z 1483564 ']' 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:04.900 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:04.900 13:37:19 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:04.900 [2024-06-10 13:37:19.338146] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:09:04.900 [2024-06-10 13:37:19.338207] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1483564 ] 00:09:05.160 [2024-06-10 13:37:19.412085] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:05.160 [2024-06-10 13:37:19.483181] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:09:05.160 [2024-06-10 13:37:19.483269] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@863 -- # return 0 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:06.102 Malloc_STAT 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- 
common/autotest_common.sh@898 -- # local bdev_name=Malloc_STAT 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local i 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:06.102 [ 00:09:06.102 { 00:09:06.102 "name": "Malloc_STAT", 00:09:06.102 "aliases": [ 00:09:06.102 "ca101b3f-6fa6-49b1-8009-828f0258636b" 00:09:06.102 ], 00:09:06.102 "product_name": "Malloc disk", 00:09:06.102 "block_size": 512, 00:09:06.102 "num_blocks": 262144, 00:09:06.102 "uuid": "ca101b3f-6fa6-49b1-8009-828f0258636b", 00:09:06.102 "assigned_rate_limits": { 00:09:06.102 "rw_ios_per_sec": 0, 00:09:06.102 "rw_mbytes_per_sec": 0, 00:09:06.102 "r_mbytes_per_sec": 0, 00:09:06.102 "w_mbytes_per_sec": 0 00:09:06.102 }, 00:09:06.102 "claimed": false, 00:09:06.102 "zoned": false, 00:09:06.102 "supported_io_types": { 00:09:06.102 "read": true, 00:09:06.102 "write": true, 00:09:06.102 "unmap": true, 00:09:06.102 "write_zeroes": true, 00:09:06.102 "flush": true, 00:09:06.102 
"reset": true, 00:09:06.102 "compare": false, 00:09:06.102 "compare_and_write": false, 00:09:06.102 "abort": true, 00:09:06.102 "nvme_admin": false, 00:09:06.102 "nvme_io": false 00:09:06.102 }, 00:09:06.102 "memory_domains": [ 00:09:06.102 { 00:09:06.102 "dma_device_id": "system", 00:09:06.102 "dma_device_type": 1 00:09:06.102 }, 00:09:06.102 { 00:09:06.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:06.102 "dma_device_type": 2 00:09:06.102 } 00:09:06.102 ], 00:09:06.102 "driver_specific": {} 00:09:06.102 } 00:09:06.102 ] 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # return 0 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:09:06.102 13:37:20 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:06.102 Running I/O for 10 seconds... 
00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.014 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:09:08.014 "tick_rate": 2400000000, 00:09:08.014 "ticks": 4387922234328551, 00:09:08.014 "bdevs": [ 00:09:08.014 { 00:09:08.014 "name": "Malloc_STAT", 00:09:08.014 "bytes_read": 804303360, 00:09:08.014 "num_read_ops": 196356, 00:09:08.015 "bytes_written": 0, 00:09:08.015 "num_write_ops": 0, 00:09:08.015 "bytes_unmapped": 0, 00:09:08.015 "num_unmap_ops": 0, 00:09:08.015 "bytes_copied": 0, 00:09:08.015 "num_copy_ops": 0, 00:09:08.015 "read_latency_ticks": 2341073597638, 00:09:08.015 "max_read_latency_ticks": 14637304, 00:09:08.015 "min_read_latency_ticks": 231354, 
00:09:08.015 "write_latency_ticks": 0, 00:09:08.015 "max_write_latency_ticks": 0, 00:09:08.015 "min_write_latency_ticks": 0, 00:09:08.015 "unmap_latency_ticks": 0, 00:09:08.015 "max_unmap_latency_ticks": 0, 00:09:08.015 "min_unmap_latency_ticks": 0, 00:09:08.015 "copy_latency_ticks": 0, 00:09:08.015 "max_copy_latency_ticks": 0, 00:09:08.015 "min_copy_latency_ticks": 0, 00:09:08.015 "io_error": {} 00:09:08.015 } 00:09:08.015 ] 00:09:08.015 }' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=196356 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:09:08.015 "tick_rate": 2400000000, 00:09:08.015 "ticks": 4387922387794183, 00:09:08.015 "name": "Malloc_STAT", 00:09:08.015 "channels": [ 00:09:08.015 { 00:09:08.015 "thread_id": 2, 00:09:08.015 "bytes_read": 415236096, 00:09:08.015 "num_read_ops": 101376, 00:09:08.015 "bytes_written": 0, 00:09:08.015 "num_write_ops": 0, 00:09:08.015 "bytes_unmapped": 0, 00:09:08.015 "num_unmap_ops": 0, 00:09:08.015 "bytes_copied": 0, 00:09:08.015 "num_copy_ops": 0, 00:09:08.015 "read_latency_ticks": 1209913900634, 00:09:08.015 "max_read_latency_ticks": 14637304, 00:09:08.015 "min_read_latency_ticks": 6425972, 00:09:08.015 "write_latency_ticks": 0, 00:09:08.015 "max_write_latency_ticks": 0, 00:09:08.015 "min_write_latency_ticks": 0, 00:09:08.015 "unmap_latency_ticks": 0, 00:09:08.015 "max_unmap_latency_ticks": 0, 00:09:08.015 
"min_unmap_latency_ticks": 0, 00:09:08.015 "copy_latency_ticks": 0, 00:09:08.015 "max_copy_latency_ticks": 0, 00:09:08.015 "min_copy_latency_ticks": 0 00:09:08.015 }, 00:09:08.015 { 00:09:08.015 "thread_id": 3, 00:09:08.015 "bytes_read": 419430400, 00:09:08.015 "num_read_ops": 102400, 00:09:08.015 "bytes_written": 0, 00:09:08.015 "num_write_ops": 0, 00:09:08.015 "bytes_unmapped": 0, 00:09:08.015 "num_unmap_ops": 0, 00:09:08.015 "bytes_copied": 0, 00:09:08.015 "num_copy_ops": 0, 00:09:08.015 "read_latency_ticks": 1212416733350, 00:09:08.015 "max_read_latency_ticks": 14577050, 00:09:08.015 "min_read_latency_ticks": 6409276, 00:09:08.015 "write_latency_ticks": 0, 00:09:08.015 "max_write_latency_ticks": 0, 00:09:08.015 "min_write_latency_ticks": 0, 00:09:08.015 "unmap_latency_ticks": 0, 00:09:08.015 "max_unmap_latency_ticks": 0, 00:09:08.015 "min_unmap_latency_ticks": 0, 00:09:08.015 "copy_latency_ticks": 0, 00:09:08.015 "max_copy_latency_ticks": 0, 00:09:08.015 "min_copy_latency_ticks": 0 00:09:08.015 } 00:09:08.015 ] 00:09:08.015 }' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=101376 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=101376 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=102400 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=203776 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:09:08.015 "tick_rate": 2400000000, 00:09:08.015 "ticks": 4387922660327731, 00:09:08.015 "bdevs": [ 00:09:08.015 { 00:09:08.015 "name": "Malloc_STAT", 00:09:08.015 "bytes_read": 882946560, 00:09:08.015 "num_read_ops": 215556, 00:09:08.015 "bytes_written": 0, 00:09:08.015 "num_write_ops": 0, 00:09:08.015 "bytes_unmapped": 0, 00:09:08.015 "num_unmap_ops": 0, 00:09:08.015 "bytes_copied": 0, 00:09:08.015 "num_copy_ops": 0, 00:09:08.015 "read_latency_ticks": 2561183996548, 00:09:08.015 "max_read_latency_ticks": 14637304, 00:09:08.015 "min_read_latency_ticks": 231354, 00:09:08.015 "write_latency_ticks": 0, 00:09:08.015 "max_write_latency_ticks": 0, 00:09:08.015 "min_write_latency_ticks": 0, 00:09:08.015 "unmap_latency_ticks": 0, 00:09:08.015 "max_unmap_latency_ticks": 0, 00:09:08.015 "min_unmap_latency_ticks": 0, 00:09:08.015 "copy_latency_ticks": 0, 00:09:08.015 "max_copy_latency_ticks": 0, 00:09:08.015 "min_copy_latency_ticks": 0, 00:09:08.015 "io_error": {} 00:09:08.015 } 00:09:08.015 ] 00:09:08.015 }' 00:09:08.015 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=215556 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 203776 -lt 196356 ']' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 203776 -gt 215556 ']' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:08.275 00:09:08.275 
Latency(us) 00:09:08.275 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:08.275 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:08.275 Malloc_STAT : 2.16 51486.12 201.12 0.00 0.00 4961.05 1003.52 6116.69 00:09:08.275 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:08.275 Malloc_STAT : 2.16 51925.70 202.83 0.00 0.00 4919.13 843.09 6089.39 00:09:08.275 =================================================================================================================== 00:09:08.275 Total : 103411.82 403.95 0.00 0.00 4939.99 843.09 6116.69 00:09:08.275 0 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 1483564 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@949 -- # '[' -z 1483564 ']' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # kill -0 1483564 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # uname 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1483564 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1483564' 00:09:08.275 killing process with pid 1483564 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # kill 1483564 00:09:08.275 Received shutdown signal, test time was about 2.229676 seconds 00:09:08.275 00:09:08.275 Latency(us) 
00:09:08.275 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:08.275 =================================================================================================================== 00:09:08.275 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@973 -- # wait 1483564 00:09:08.275 13:37:22 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:09:08.275 00:09:08.275 real 0m3.437s 00:09:08.275 user 0m7.083s 00:09:08.275 sys 0m0.302s 00:09:08.276 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:08.276 13:37:22 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:08.276 ************************************ 00:09:08.276 END TEST bdev_stat 00:09:08.276 ************************************ 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:08.536 13:37:22 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:08.536 00:09:08.536 real 1m48.229s 00:09:08.536 user 7m23.547s 00:09:08.536 sys 0m15.069s 00:09:08.536 13:37:22 
blockdev_general -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:08.536 13:37:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:08.536 ************************************ 00:09:08.536 END TEST blockdev_general 00:09:08.536 ************************************ 00:09:08.536 13:37:22 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:08.536 13:37:22 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:09:08.536 13:37:22 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:08.536 13:37:22 -- common/autotest_common.sh@10 -- # set +x 00:09:08.536 ************************************ 00:09:08.536 START TEST bdev_raid 00:09:08.536 ************************************ 00:09:08.536 13:37:22 bdev_raid -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:08.536 * Looking for test storage... 00:09:08.536 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:08.536 13:37:22 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 
00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:09:08.536 13:37:22 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:08.536 13:37:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:08.536 13:37:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:08.536 13:37:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:08.797 ************************************ 00:09:08.797 START TEST raid_function_test_raid0 00:09:08.797 ************************************ 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # raid_function_test raid0 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1484319 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1484319' 00:09:08.797 Process raid pid: 1484319 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1484319 /var/tmp/spdk-raid.sock 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@830 -- # '[' -z 1484319 ']' 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local max_retries=100 
00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:08.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:08.797 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:08.797 [2024-06-10 13:37:23.067396] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:09:08.797 [2024-06-10 13:37:23.067445] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:08.797 [2024-06-10 13:37:23.157160] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.797 [2024-06-10 13:37:23.223594] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.797 [2024-06-10 13:37:23.266370] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:08.797 [2024-06-10 13:37:23.266389] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@863 -- # return 0 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 
-- # cat 00:09:09.738 13:37:23 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:09.738 [2024-06-10 13:37:24.155509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:09.738 [2024-06-10 13:37:24.156678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:09.738 [2024-06-10 13:37:24.156721] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe56890 00:09:09.738 [2024-06-10 13:37:24.156727] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:09.738 [2024-06-10 13:37:24.156880] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe56c70 00:09:09.738 [2024-06-10 13:37:24.156973] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe56890 00:09:09.738 [2024-06-10 13:37:24.156979] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xe56890 00:09:09.738 [2024-06-10 13:37:24.157064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:09.738 Base_1 00:09:09.738 Base_2 00:09:09.738 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:09.738 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:09.738 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:09.998 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:09.999 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:10.260 [2024-06-10 13:37:24.576582] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe56c70 00:09:10.260 /dev/nbd0 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local i 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # grep -q -w nbd0 
/proc/partitions 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # break 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.260 1+0 records in 00:09:10.260 1+0 records out 00:09:10.260 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274397 s, 14.9 MB/s 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # size=4096 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # return 0 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:10.260 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:10.520 13:37:24 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:10.520 { 00:09:10.520 "nbd_device": "/dev/nbd0", 00:09:10.520 "bdev_name": "raid" 00:09:10.520 } 00:09:10.520 ]' 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:10.520 { 00:09:10.520 "nbd_device": "/dev/nbd0", 00:09:10.520 "bdev_name": "raid" 00:09:10.520 } 00:09:10.520 ]' 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:10.520 4096+0 records in 00:09:10.520 4096+0 records out 00:09:10.520 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.028116 s, 74.6 MB/s 00:09:10.520 13:37:24 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:10.780 4096+0 records in 00:09:10.780 4096+0 records out 00:09:10.780 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.180585 s, 11.6 MB/s 00:09:10.780 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:10.780 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:10.780 13:37:25 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:10.780 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:10.780 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:10.781 128+0 records in 00:09:10.781 128+0 records out 00:09:10.781 65536 bytes (66 kB, 64 KiB) copied, 0.000361359 s, 181 MB/s 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:10.781 2035+0 records in 00:09:10.781 2035+0 records out 00:09:10.781 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00476915 s, 218 MB/s 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs 
/dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:10.781 456+0 records in 00:09:10.781 456+0 records out 00:09:10.781 233472 bytes (233 kB, 228 KiB) copied, 0.00113986 s, 205 MB/s 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.781 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:11.041 [2024-06-10 13:37:25.415412] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:11.041 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 
00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1484319 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@949 -- # '[' -z 1484319 ']' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # kill -0 1484319 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # uname 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1484319 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1484319' 00:09:11.302 killing process with pid 1484319 
00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # kill 1484319 00:09:11.302 [2024-06-10 13:37:25.748812] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:11.302 [2024-06-10 13:37:25.748860] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:11.302 [2024-06-10 13:37:25.748890] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:11.302 [2024-06-10 13:37:25.748896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe56890 name raid, state offline 00:09:11.302 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@973 -- # wait 1484319 00:09:11.302 [2024-06-10 13:37:25.758335] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:11.563 13:37:25 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:09:11.563 00:09:11.563 real 0m2.867s 00:09:11.563 user 0m4.058s 00:09:11.563 sys 0m0.799s 00:09:11.563 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:11.563 13:37:25 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:11.563 ************************************ 00:09:11.563 END TEST raid_function_test_raid0 00:09:11.563 ************************************ 00:09:11.563 13:37:25 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:09:11.563 13:37:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:11.563 13:37:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:11.563 13:37:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:11.563 ************************************ 00:09:11.563 START TEST raid_function_test_concat 00:09:11.563 ************************************ 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- 
# raid_function_test concat 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1485058 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1485058' 00:09:11.563 Process raid pid: 1485058 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1485058 /var/tmp/spdk-raid.sock 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@830 -- # '[' -z 1485058 ']' 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:11.563 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:11.563 13:37:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:11.563 [2024-06-10 13:37:26.004598] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:09:11.563 [2024-06-10 13:37:26.004641] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:11.823 [2024-06-10 13:37:26.094111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.824 [2024-06-10 13:37:26.159772] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.824 [2024-06-10 13:37:26.201205] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:11.824 [2024-06-10 13:37:26.201228] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@863 -- # return 0 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:09:12.395 13:37:26 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:12.684 [2024-06-10 13:37:27.033783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:12.684 [2024-06-10 13:37:27.034955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:12.684 [2024-06-10 13:37:27.034997] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xad2890 00:09:12.684 [2024-06-10 13:37:27.035003] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:12.684 [2024-06-10 13:37:27.035158] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad4110 00:09:12.684 [2024-06-10 13:37:27.035269] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xad2890 00:09:12.684 [2024-06-10 13:37:27.035279] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xad2890 00:09:12.684 [2024-06-10 13:37:27.035359] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:12.684 Base_1 00:09:12.684 Base_2 00:09:12.684 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:12.684 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:12.684 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:12.978 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:13.249 [2024-06-10 13:37:27.450846] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xad4110 00:09:13.249 /dev/nbd0 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local i 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # break 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:13.249 1+0 records in 
00:09:13.249 1+0 records out 00:09:13.249 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000179756 s, 22.8 MB/s 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # size=4096 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # return 0 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:13.249 { 00:09:13.249 "nbd_device": "/dev/nbd0", 00:09:13.249 "bdev_name": "raid" 00:09:13.249 } 00:09:13.249 ]' 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:13.249 { 00:09:13.249 "nbd_device": "/dev/nbd0", 00:09:13.249 "bdev_name": "raid" 00:09:13.249 } 00:09:13.249 ]' 00:09:13.249 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:13.509 13:37:27 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:13.509 4096+0 records in 00:09:13.509 4096+0 records out 00:09:13.509 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0277695 s, 75.5 MB/s 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:13.509 4096+0 records in 00:09:13.509 4096+0 records out 00:09:13.509 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.175056 s, 12.0 MB/s 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:13.509 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 
00:09:13.770 128+0 records in 00:09:13.770 128+0 records out 00:09:13.770 65536 bytes (66 kB, 64 KiB) copied, 0.000371045 s, 177 MB/s 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:13.770 13:37:27 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:13.770 2035+0 records in 00:09:13.770 2035+0 records out 00:09:13.770 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00489772 s, 213 MB/s 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:13.770 456+0 records in 00:09:13.770 456+0 records out 00:09:13.770 233472 bytes (233 kB, 228 KiB) copied, 0.00115541 s, 202 MB/s 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:13.770 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.771 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.031 [2024-06-10 13:37:28.278774] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:14.031 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1485058 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@949 -- # '[' -z 1485058 ']' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # kill -0 1485058 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # uname 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1485058 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1485058' 00:09:14.292 killing process with pid 1485058 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # kill 1485058 00:09:14.292 [2024-06-10 13:37:28.601055] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:14.292 [2024-06-10 13:37:28.601106] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:14.292 [2024-06-10 13:37:28.601138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:09:14.292 [2024-06-10 13:37:28.601145] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xad2890 name raid, state offline 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@973 -- # wait 1485058 00:09:14.292 [2024-06-10 13:37:28.610821] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:09:14.292 00:09:14.292 real 0m2.782s 00:09:14.292 user 0m3.874s 00:09:14.292 sys 0m0.825s 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:14.292 13:37:28 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:14.292 ************************************ 00:09:14.292 END TEST raid_function_test_concat 00:09:14.292 ************************************ 00:09:14.552 13:37:28 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:09:14.552 13:37:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:09:14.552 13:37:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:14.552 13:37:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:14.552 ************************************ 00:09:14.553 START TEST raid0_resize_test 00:09:14.553 ************************************ 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # raid0_resize_test 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local 
raid_size_mb 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1485800 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1485800' 00:09:14.553 Process raid pid: 1485800 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1485800 /var/tmp/spdk-raid.sock 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@830 -- # '[' -z 1485800 ']' 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:14.553 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:14.553 13:37:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:14.553 [2024-06-10 13:37:28.865841] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:09:14.553 [2024-06-10 13:37:28.865888] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:14.553 [2024-06-10 13:37:28.955877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.553 [2024-06-10 13:37:29.022440] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.813 [2024-06-10 13:37:29.062209] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:14.813 [2024-06-10 13:37:29.062231] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:15.384 13:37:29 bdev_raid.raid0_resize_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:15.385 13:37:29 bdev_raid.raid0_resize_test -- common/autotest_common.sh@863 -- # return 0 00:09:15.385 13:37:29 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:15.646 Base_1 00:09:15.646 13:37:29 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:15.907 Base_2 00:09:15.907 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:15.907 [2024-06-10 13:37:30.359150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:15.907 [2024-06-10 13:37:30.360384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:15.907 [2024-06-10 13:37:30.360419] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25f1130 00:09:15.907 [2024-06-10 13:37:30.360429] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:15.907 [2024-06-10 13:37:30.360597] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x243d8b0 00:09:15.907 [2024-06-10 13:37:30.360672] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25f1130 00:09:15.907 [2024-06-10 13:37:30.360678] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x25f1130 00:09:15.907 [2024-06-10 13:37:30.360755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:15.907 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:16.168 [2024-06-10 13:37:30.559639] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:16.168 [2024-06-10 13:37:30.559650] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:16.168 true 00:09:16.168 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:16.168 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:09:16.429 [2024-06-10 13:37:30.776287] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:16.429 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:09:16.429 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:09:16.429 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:09:16.429 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:16.689 
[2024-06-10 13:37:30.980708] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:16.689 [2024-06-10 13:37:30.980718] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:16.689 [2024-06-10 13:37:30.980732] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:16.689 true 00:09:16.689 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:16.689 13:37:30 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:09:16.950 [2024-06-10 13:37:31.185334] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1485800 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@949 -- # '[' -z 1485800 ']' 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # kill -0 1485800 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # uname 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1485800 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@959 -- # '[' 
reactor_0 = sudo ']' 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1485800' 00:09:16.950 killing process with pid 1485800 00:09:16.950 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # kill 1485800 00:09:16.951 [2024-06-10 13:37:31.254414] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:16.951 [2024-06-10 13:37:31.254453] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:16.951 [2024-06-10 13:37:31.254485] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:16.951 [2024-06-10 13:37:31.254496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f1130 name Raid, state offline 00:09:16.951 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@973 -- # wait 1485800 00:09:16.951 [2024-06-10 13:37:31.255445] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:16.951 13:37:31 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:09:16.951 00:09:16.951 real 0m2.560s 00:09:16.951 user 0m4.019s 00:09:16.951 sys 0m0.454s 00:09:16.951 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:16.951 13:37:31 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:16.951 ************************************ 00:09:16.951 END TEST raid0_resize_test 00:09:16.951 ************************************ 00:09:16.951 13:37:31 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:09:16.951 13:37:31 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:16.951 13:37:31 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:16.951 13:37:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:16.951 13:37:31 bdev_raid -- common/autotest_common.sh@1106 -- 
# xtrace_disable 00:09:16.951 13:37:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:17.212 ************************************ 00:09:17.212 START TEST raid_state_function_test 00:09:17.212 ************************************ 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 false 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:17.212 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 
00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1486184 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1486184' 00:09:17.213 Process raid pid: 1486184 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1486184 /var/tmp/spdk-raid.sock 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1486184 ']' 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:17.213 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:17.213 13:37:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:17.213 [2024-06-10 13:37:31.519482] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:09:17.213 [2024-06-10 13:37:31.519549] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:17.213 [2024-06-10 13:37:31.612902] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.213 [2024-06-10 13:37:31.681898] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.474 [2024-06-10 13:37:31.728289] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:17.474 [2024-06-10 13:37:31.728314] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:18.045 13:37:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:18.045 13:37:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:09:18.045 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:18.306 [2024-06-10 13:37:32.556608] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:18.306 [2024-06-10 13:37:32.556635] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:18.306 [2024-06-10 13:37:32.556642] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable 
to find bdev with name: BaseBdev2 00:09:18.306 [2024-06-10 13:37:32.556648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:18.306 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:18.567 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:18.567 "name": "Existed_Raid", 00:09:18.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.567 "strip_size_kb": 64, 00:09:18.567 "state": "configuring", 00:09:18.567 "raid_level": "raid0", 00:09:18.567 "superblock": 
false, 00:09:18.567 "num_base_bdevs": 2, 00:09:18.567 "num_base_bdevs_discovered": 0, 00:09:18.567 "num_base_bdevs_operational": 2, 00:09:18.567 "base_bdevs_list": [ 00:09:18.567 { 00:09:18.567 "name": "BaseBdev1", 00:09:18.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.567 "is_configured": false, 00:09:18.567 "data_offset": 0, 00:09:18.567 "data_size": 0 00:09:18.567 }, 00:09:18.567 { 00:09:18.567 "name": "BaseBdev2", 00:09:18.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.567 "is_configured": false, 00:09:18.567 "data_offset": 0, 00:09:18.567 "data_size": 0 00:09:18.567 } 00:09:18.567 ] 00:09:18.567 }' 00:09:18.567 13:37:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:18.567 13:37:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:19.138 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:19.138 [2024-06-10 13:37:33.543103] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:19.138 [2024-06-10 13:37:33.543122] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141f720 name Existed_Raid, state configuring 00:09:19.138 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:19.399 [2024-06-10 13:37:33.747635] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:19.399 [2024-06-10 13:37:33.747652] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:19.399 [2024-06-10 13:37:33.747658] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:19.399 [2024-06-10 
13:37:33.747664] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:19.399 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:19.659 [2024-06-10 13:37:33.959032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:19.659 BaseBdev1 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:19.659 13:37:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:19.920 [ 00:09:19.920 { 00:09:19.920 "name": "BaseBdev1", 00:09:19.920 "aliases": [ 00:09:19.920 "6a1dfece-cce4-4761-a8c6-12e8e6c10f18" 00:09:19.920 ], 00:09:19.920 "product_name": "Malloc disk", 00:09:19.920 "block_size": 512, 00:09:19.920 "num_blocks": 65536, 00:09:19.920 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:19.920 "assigned_rate_limits": { 00:09:19.920 "rw_ios_per_sec": 0, 00:09:19.920 
"rw_mbytes_per_sec": 0, 00:09:19.920 "r_mbytes_per_sec": 0, 00:09:19.920 "w_mbytes_per_sec": 0 00:09:19.920 }, 00:09:19.920 "claimed": true, 00:09:19.920 "claim_type": "exclusive_write", 00:09:19.920 "zoned": false, 00:09:19.920 "supported_io_types": { 00:09:19.920 "read": true, 00:09:19.920 "write": true, 00:09:19.920 "unmap": true, 00:09:19.920 "write_zeroes": true, 00:09:19.920 "flush": true, 00:09:19.920 "reset": true, 00:09:19.920 "compare": false, 00:09:19.920 "compare_and_write": false, 00:09:19.920 "abort": true, 00:09:19.920 "nvme_admin": false, 00:09:19.920 "nvme_io": false 00:09:19.920 }, 00:09:19.920 "memory_domains": [ 00:09:19.920 { 00:09:19.920 "dma_device_id": "system", 00:09:19.920 "dma_device_type": 1 00:09:19.920 }, 00:09:19.920 { 00:09:19.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.920 "dma_device_type": 2 00:09:19.920 } 00:09:19.920 ], 00:09:19.920 "driver_specific": {} 00:09:19.920 } 00:09:19.920 ] 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:19.920 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:20.181 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:20.181 "name": "Existed_Raid", 00:09:20.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:20.181 "strip_size_kb": 64, 00:09:20.181 "state": "configuring", 00:09:20.181 "raid_level": "raid0", 00:09:20.181 "superblock": false, 00:09:20.181 "num_base_bdevs": 2, 00:09:20.181 "num_base_bdevs_discovered": 1, 00:09:20.181 "num_base_bdevs_operational": 2, 00:09:20.181 "base_bdevs_list": [ 00:09:20.181 { 00:09:20.181 "name": "BaseBdev1", 00:09:20.181 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:20.181 "is_configured": true, 00:09:20.181 "data_offset": 0, 00:09:20.181 "data_size": 65536 00:09:20.181 }, 00:09:20.181 { 00:09:20.181 "name": "BaseBdev2", 00:09:20.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:20.181 "is_configured": false, 00:09:20.181 "data_offset": 0, 00:09:20.181 "data_size": 0 00:09:20.181 } 00:09:20.181 ] 00:09:20.181 }' 00:09:20.181 13:37:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:20.181 13:37:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:20.753 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:21.014 [2024-06-10 13:37:35.302433] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:21.014 [2024-06-10 13:37:35.302458] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141f010 name Existed_Raid, state configuring 00:09:21.014 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:21.274 [2024-06-10 13:37:35.490941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:21.274 [2024-06-10 13:37:35.492169] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:21.274 [2024-06-10 13:37:35.492192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:21.274 "name": "Existed_Raid", 00:09:21.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:21.274 "strip_size_kb": 64, 00:09:21.274 "state": "configuring", 00:09:21.274 "raid_level": "raid0", 00:09:21.274 "superblock": false, 00:09:21.274 "num_base_bdevs": 2, 00:09:21.274 "num_base_bdevs_discovered": 1, 00:09:21.274 "num_base_bdevs_operational": 2, 00:09:21.274 "base_bdevs_list": [ 00:09:21.274 { 00:09:21.274 "name": "BaseBdev1", 00:09:21.274 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:21.274 "is_configured": true, 00:09:21.274 "data_offset": 0, 00:09:21.274 "data_size": 65536 00:09:21.274 }, 00:09:21.274 { 00:09:21.274 "name": "BaseBdev2", 00:09:21.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:21.274 "is_configured": false, 00:09:21.274 "data_offset": 0, 00:09:21.274 "data_size": 0 00:09:21.274 } 00:09:21.274 ] 00:09:21.274 }' 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:21.274 13:37:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:21.847 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:22.108 [2024-06-10 
13:37:36.426374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:22.108 [2024-06-10 13:37:36.426395] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x141fe00 00:09:22.108 [2024-06-10 13:37:36.426399] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:22.108 [2024-06-10 13:37:36.426554] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14165d0 00:09:22.108 [2024-06-10 13:37:36.426647] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x141fe00 00:09:22.108 [2024-06-10 13:37:36.426653] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x141fe00 00:09:22.108 [2024-06-10 13:37:36.426780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:22.108 BaseBdev2 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:22.108 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:22.368 [ 
00:09:22.368 { 00:09:22.368 "name": "BaseBdev2", 00:09:22.368 "aliases": [ 00:09:22.368 "50e91b82-8324-4380-94b7-db8479eea3af" 00:09:22.368 ], 00:09:22.368 "product_name": "Malloc disk", 00:09:22.368 "block_size": 512, 00:09:22.368 "num_blocks": 65536, 00:09:22.368 "uuid": "50e91b82-8324-4380-94b7-db8479eea3af", 00:09:22.368 "assigned_rate_limits": { 00:09:22.368 "rw_ios_per_sec": 0, 00:09:22.368 "rw_mbytes_per_sec": 0, 00:09:22.368 "r_mbytes_per_sec": 0, 00:09:22.368 "w_mbytes_per_sec": 0 00:09:22.368 }, 00:09:22.368 "claimed": true, 00:09:22.368 "claim_type": "exclusive_write", 00:09:22.368 "zoned": false, 00:09:22.368 "supported_io_types": { 00:09:22.368 "read": true, 00:09:22.368 "write": true, 00:09:22.368 "unmap": true, 00:09:22.368 "write_zeroes": true, 00:09:22.368 "flush": true, 00:09:22.368 "reset": true, 00:09:22.368 "compare": false, 00:09:22.368 "compare_and_write": false, 00:09:22.368 "abort": true, 00:09:22.368 "nvme_admin": false, 00:09:22.368 "nvme_io": false 00:09:22.368 }, 00:09:22.368 "memory_domains": [ 00:09:22.368 { 00:09:22.368 "dma_device_id": "system", 00:09:22.368 "dma_device_type": 1 00:09:22.368 }, 00:09:22.368 { 00:09:22.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:22.368 "dma_device_type": 2 00:09:22.368 } 00:09:22.368 ], 00:09:22.368 "driver_specific": {} 00:09:22.368 } 00:09:22.368 ] 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:22.368 13:37:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:22.628 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:22.628 "name": "Existed_Raid", 00:09:22.628 "uuid": "6e316e2e-8c8f-495b-8ac4-ecead1c805aa", 00:09:22.628 "strip_size_kb": 64, 00:09:22.628 "state": "online", 00:09:22.628 "raid_level": "raid0", 00:09:22.628 "superblock": false, 00:09:22.628 "num_base_bdevs": 2, 00:09:22.628 "num_base_bdevs_discovered": 2, 00:09:22.628 "num_base_bdevs_operational": 2, 00:09:22.628 "base_bdevs_list": [ 00:09:22.628 { 00:09:22.628 "name": "BaseBdev1", 00:09:22.628 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:22.628 "is_configured": true, 00:09:22.628 "data_offset": 0, 00:09:22.628 "data_size": 65536 00:09:22.628 }, 00:09:22.628 { 00:09:22.628 "name": "BaseBdev2", 00:09:22.628 "uuid": "50e91b82-8324-4380-94b7-db8479eea3af", 
00:09:22.628 "is_configured": true, 00:09:22.628 "data_offset": 0, 00:09:22.628 "data_size": 65536 00:09:22.628 } 00:09:22.628 ] 00:09:22.628 }' 00:09:22.628 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:22.628 13:37:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:23.198 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:23.458 [2024-06-10 13:37:37.794037] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:23.458 "name": "Existed_Raid", 00:09:23.458 "aliases": [ 00:09:23.458 "6e316e2e-8c8f-495b-8ac4-ecead1c805aa" 00:09:23.458 ], 00:09:23.458 "product_name": "Raid Volume", 00:09:23.458 "block_size": 512, 00:09:23.458 "num_blocks": 131072, 00:09:23.458 "uuid": "6e316e2e-8c8f-495b-8ac4-ecead1c805aa", 00:09:23.458 "assigned_rate_limits": { 00:09:23.458 "rw_ios_per_sec": 0, 00:09:23.458 "rw_mbytes_per_sec": 0, 00:09:23.458 "r_mbytes_per_sec": 0, 
00:09:23.458 "w_mbytes_per_sec": 0 00:09:23.458 }, 00:09:23.458 "claimed": false, 00:09:23.458 "zoned": false, 00:09:23.458 "supported_io_types": { 00:09:23.458 "read": true, 00:09:23.458 "write": true, 00:09:23.458 "unmap": true, 00:09:23.458 "write_zeroes": true, 00:09:23.458 "flush": true, 00:09:23.458 "reset": true, 00:09:23.458 "compare": false, 00:09:23.458 "compare_and_write": false, 00:09:23.458 "abort": false, 00:09:23.458 "nvme_admin": false, 00:09:23.458 "nvme_io": false 00:09:23.458 }, 00:09:23.458 "memory_domains": [ 00:09:23.458 { 00:09:23.458 "dma_device_id": "system", 00:09:23.458 "dma_device_type": 1 00:09:23.458 }, 00:09:23.458 { 00:09:23.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.458 "dma_device_type": 2 00:09:23.458 }, 00:09:23.458 { 00:09:23.458 "dma_device_id": "system", 00:09:23.458 "dma_device_type": 1 00:09:23.458 }, 00:09:23.458 { 00:09:23.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.458 "dma_device_type": 2 00:09:23.458 } 00:09:23.458 ], 00:09:23.458 "driver_specific": { 00:09:23.458 "raid": { 00:09:23.458 "uuid": "6e316e2e-8c8f-495b-8ac4-ecead1c805aa", 00:09:23.458 "strip_size_kb": 64, 00:09:23.458 "state": "online", 00:09:23.458 "raid_level": "raid0", 00:09:23.458 "superblock": false, 00:09:23.458 "num_base_bdevs": 2, 00:09:23.458 "num_base_bdevs_discovered": 2, 00:09:23.458 "num_base_bdevs_operational": 2, 00:09:23.458 "base_bdevs_list": [ 00:09:23.458 { 00:09:23.458 "name": "BaseBdev1", 00:09:23.458 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:23.458 "is_configured": true, 00:09:23.458 "data_offset": 0, 00:09:23.458 "data_size": 65536 00:09:23.458 }, 00:09:23.458 { 00:09:23.458 "name": "BaseBdev2", 00:09:23.458 "uuid": "50e91b82-8324-4380-94b7-db8479eea3af", 00:09:23.458 "is_configured": true, 00:09:23.458 "data_offset": 0, 00:09:23.458 "data_size": 65536 00:09:23.458 } 00:09:23.458 ] 00:09:23.458 } 00:09:23.458 } 00:09:23.458 }' 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:23.458 BaseBdev2' 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:23.458 13:37:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:23.719 "name": "BaseBdev1", 00:09:23.719 "aliases": [ 00:09:23.719 "6a1dfece-cce4-4761-a8c6-12e8e6c10f18" 00:09:23.719 ], 00:09:23.719 "product_name": "Malloc disk", 00:09:23.719 "block_size": 512, 00:09:23.719 "num_blocks": 65536, 00:09:23.719 "uuid": "6a1dfece-cce4-4761-a8c6-12e8e6c10f18", 00:09:23.719 "assigned_rate_limits": { 00:09:23.719 "rw_ios_per_sec": 0, 00:09:23.719 "rw_mbytes_per_sec": 0, 00:09:23.719 "r_mbytes_per_sec": 0, 00:09:23.719 "w_mbytes_per_sec": 0 00:09:23.719 }, 00:09:23.719 "claimed": true, 00:09:23.719 "claim_type": "exclusive_write", 00:09:23.719 "zoned": false, 00:09:23.719 "supported_io_types": { 00:09:23.719 "read": true, 00:09:23.719 "write": true, 00:09:23.719 "unmap": true, 00:09:23.719 "write_zeroes": true, 00:09:23.719 "flush": true, 00:09:23.719 "reset": true, 00:09:23.719 "compare": false, 00:09:23.719 "compare_and_write": false, 00:09:23.719 "abort": true, 00:09:23.719 "nvme_admin": false, 00:09:23.719 "nvme_io": false 00:09:23.719 }, 00:09:23.719 "memory_domains": [ 00:09:23.719 { 00:09:23.719 "dma_device_id": "system", 00:09:23.719 "dma_device_type": 1 00:09:23.719 }, 00:09:23.719 { 00:09:23.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.719 
"dma_device_type": 2 00:09:23.719 } 00:09:23.719 ], 00:09:23.719 "driver_specific": {} 00:09:23.719 }' 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:23.719 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:23.980 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:24.241 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:24.241 "name": "BaseBdev2", 00:09:24.241 "aliases": [ 00:09:24.241 "50e91b82-8324-4380-94b7-db8479eea3af" 00:09:24.241 ], 00:09:24.241 
"product_name": "Malloc disk", 00:09:24.241 "block_size": 512, 00:09:24.241 "num_blocks": 65536, 00:09:24.241 "uuid": "50e91b82-8324-4380-94b7-db8479eea3af", 00:09:24.241 "assigned_rate_limits": { 00:09:24.241 "rw_ios_per_sec": 0, 00:09:24.241 "rw_mbytes_per_sec": 0, 00:09:24.241 "r_mbytes_per_sec": 0, 00:09:24.241 "w_mbytes_per_sec": 0 00:09:24.241 }, 00:09:24.241 "claimed": true, 00:09:24.241 "claim_type": "exclusive_write", 00:09:24.241 "zoned": false, 00:09:24.241 "supported_io_types": { 00:09:24.241 "read": true, 00:09:24.241 "write": true, 00:09:24.241 "unmap": true, 00:09:24.241 "write_zeroes": true, 00:09:24.241 "flush": true, 00:09:24.241 "reset": true, 00:09:24.241 "compare": false, 00:09:24.241 "compare_and_write": false, 00:09:24.241 "abort": true, 00:09:24.241 "nvme_admin": false, 00:09:24.241 "nvme_io": false 00:09:24.241 }, 00:09:24.241 "memory_domains": [ 00:09:24.241 { 00:09:24.241 "dma_device_id": "system", 00:09:24.241 "dma_device_type": 1 00:09:24.241 }, 00:09:24.241 { 00:09:24.241 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:24.241 "dma_device_type": 2 00:09:24.241 } 00:09:24.241 ], 00:09:24.241 "driver_specific": {} 00:09:24.241 }' 00:09:24.241 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:24.241 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:24.241 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:24.241 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:24.501 13:37:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:24.761 [2024-06-10 13:37:39.149316] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:24.761 [2024-06-10 13:37:39.149332] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:24.761 [2024-06-10 13:37:39.149362] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:24.761 
13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:24.761 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:25.021 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:25.021 "name": "Existed_Raid", 00:09:25.021 "uuid": "6e316e2e-8c8f-495b-8ac4-ecead1c805aa", 00:09:25.021 "strip_size_kb": 64, 00:09:25.021 "state": "offline", 00:09:25.021 "raid_level": "raid0", 00:09:25.021 "superblock": false, 00:09:25.021 "num_base_bdevs": 2, 00:09:25.021 "num_base_bdevs_discovered": 1, 00:09:25.021 "num_base_bdevs_operational": 1, 00:09:25.021 "base_bdevs_list": [ 00:09:25.021 { 00:09:25.021 "name": null, 00:09:25.021 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:25.021 "is_configured": false, 00:09:25.021 "data_offset": 0, 00:09:25.021 "data_size": 65536 00:09:25.021 }, 00:09:25.021 { 00:09:25.021 "name": "BaseBdev2", 00:09:25.021 "uuid": "50e91b82-8324-4380-94b7-db8479eea3af", 00:09:25.021 "is_configured": true, 00:09:25.021 "data_offset": 0, 00:09:25.021 "data_size": 65536 00:09:25.021 } 00:09:25.021 ] 00:09:25.021 }' 00:09:25.021 13:37:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:25.021 13:37:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:25.591 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:25.591 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:25.591 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:25.591 13:37:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:25.851 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:25.851 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:25.851 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:26.111 [2024-06-10 13:37:40.332421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:26.111 [2024-06-10 13:37:40.332454] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141fe00 name Existed_Raid, state offline 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1486184 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1486184 ']' 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1486184 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:26.111 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1486184 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1486184' 00:09:26.373 killing process with pid 1486184 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1486184 00:09:26.373 [2024-06-10 13:37:40.609862] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1486184 00:09:26.373 [2024-06-10 13:37:40.610482] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:26.373 00:09:26.373 real 0m9.286s 00:09:26.373 user 0m16.880s 00:09:26.373 sys 0m1.376s 00:09:26.373 13:37:40 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:26.373 ************************************ 00:09:26.373 END TEST raid_state_function_test 00:09:26.373 ************************************ 00:09:26.373 13:37:40 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:26.373 13:37:40 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:26.373 13:37:40 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:26.373 13:37:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:26.373 ************************************ 00:09:26.373 START TEST raid_state_function_test_sb 00:09:26.373 ************************************ 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 true 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1488374 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1488374' 00:09:26.373 Process raid pid: 1488374 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1488374 /var/tmp/spdk-raid.sock 
00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1488374 ']' 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:26.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:26.373 13:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:26.633 [2024-06-10 13:37:40.870813] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:09:26.633 [2024-06-10 13:37:40.870862] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:26.633 [2024-06-10 13:37:40.959818] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.633 [2024-06-10 13:37:41.025084] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.633 [2024-06-10 13:37:41.074627] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:26.633 [2024-06-10 13:37:41.074652] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:27.575 [2024-06-10 13:37:41.914614] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:27.575 [2024-06-10 13:37:41.914643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:27.575 [2024-06-10 13:37:41.914649] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:27.575 [2024-06-10 13:37:41.914656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:27.575 13:37:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:27.575 13:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:27.840 13:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:27.840 "name": "Existed_Raid", 00:09:27.840 "uuid": "1d9839b7-158a-40e5-8a5c-9e911e245977", 00:09:27.840 "strip_size_kb": 64, 00:09:27.840 "state": "configuring", 00:09:27.840 "raid_level": "raid0", 00:09:27.840 "superblock": true, 00:09:27.840 "num_base_bdevs": 2, 00:09:27.840 "num_base_bdevs_discovered": 0, 00:09:27.840 "num_base_bdevs_operational": 2, 00:09:27.840 "base_bdevs_list": [ 00:09:27.840 { 00:09:27.840 "name": "BaseBdev1", 00:09:27.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:27.840 "is_configured": false, 00:09:27.840 "data_offset": 0, 00:09:27.840 "data_size": 0 00:09:27.840 }, 00:09:27.840 { 00:09:27.840 "name": 
"BaseBdev2", 00:09:27.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:27.840 "is_configured": false, 00:09:27.840 "data_offset": 0, 00:09:27.840 "data_size": 0 00:09:27.840 } 00:09:27.840 ] 00:09:27.840 }' 00:09:27.840 13:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:27.840 13:37:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:28.409 13:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:28.409 [2024-06-10 13:37:42.828816] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:28.409 [2024-06-10 13:37:42.828832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1536720 name Existed_Raid, state configuring 00:09:28.409 13:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:28.670 [2024-06-10 13:37:43.033356] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:28.670 [2024-06-10 13:37:43.033375] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:28.670 [2024-06-10 13:37:43.033381] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:28.670 [2024-06-10 13:37:43.033387] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:28.670 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:28.930 [2024-06-10 13:37:43.236624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:09:28.930 BaseBdev1 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:28.930 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:29.191 [ 00:09:29.191 { 00:09:29.191 "name": "BaseBdev1", 00:09:29.191 "aliases": [ 00:09:29.191 "97624ef6-ee7b-4736-9eab-bc0864e8ef91" 00:09:29.191 ], 00:09:29.191 "product_name": "Malloc disk", 00:09:29.191 "block_size": 512, 00:09:29.191 "num_blocks": 65536, 00:09:29.191 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:29.191 "assigned_rate_limits": { 00:09:29.191 "rw_ios_per_sec": 0, 00:09:29.191 "rw_mbytes_per_sec": 0, 00:09:29.191 "r_mbytes_per_sec": 0, 00:09:29.191 "w_mbytes_per_sec": 0 00:09:29.191 }, 00:09:29.191 "claimed": true, 00:09:29.191 "claim_type": "exclusive_write", 00:09:29.191 "zoned": false, 00:09:29.191 "supported_io_types": { 00:09:29.191 "read": true, 00:09:29.191 "write": true, 00:09:29.191 "unmap": true, 00:09:29.191 "write_zeroes": true, 00:09:29.191 "flush": true, 
00:09:29.191 "reset": true, 00:09:29.191 "compare": false, 00:09:29.191 "compare_and_write": false, 00:09:29.191 "abort": true, 00:09:29.191 "nvme_admin": false, 00:09:29.191 "nvme_io": false 00:09:29.191 }, 00:09:29.191 "memory_domains": [ 00:09:29.191 { 00:09:29.191 "dma_device_id": "system", 00:09:29.191 "dma_device_type": 1 00:09:29.191 }, 00:09:29.191 { 00:09:29.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:29.191 "dma_device_type": 2 00:09:29.191 } 00:09:29.191 ], 00:09:29.191 "driver_specific": {} 00:09:29.191 } 00:09:29.191 ] 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:29.191 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:29.452 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:29.452 "name": "Existed_Raid", 00:09:29.452 "uuid": "6997da6f-d68c-4837-866e-2e5f5a479fb1", 00:09:29.452 "strip_size_kb": 64, 00:09:29.452 "state": "configuring", 00:09:29.452 "raid_level": "raid0", 00:09:29.452 "superblock": true, 00:09:29.452 "num_base_bdevs": 2, 00:09:29.452 "num_base_bdevs_discovered": 1, 00:09:29.452 "num_base_bdevs_operational": 2, 00:09:29.452 "base_bdevs_list": [ 00:09:29.452 { 00:09:29.452 "name": "BaseBdev1", 00:09:29.452 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:29.452 "is_configured": true, 00:09:29.452 "data_offset": 2048, 00:09:29.452 "data_size": 63488 00:09:29.452 }, 00:09:29.452 { 00:09:29.452 "name": "BaseBdev2", 00:09:29.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:29.452 "is_configured": false, 00:09:29.452 "data_offset": 0, 00:09:29.452 "data_size": 0 00:09:29.452 } 00:09:29.452 ] 00:09:29.452 }' 00:09:29.452 13:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:29.452 13:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:30.022 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:30.283 [2024-06-10 13:37:44.584038] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:30.283 [2024-06-10 13:37:44.584064] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1536010 name Existed_Raid, state configuring 00:09:30.283 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:30.543 [2024-06-10 13:37:44.788589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:30.543 [2024-06-10 13:37:44.789817] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:30.543 [2024-06-10 13:37:44.789841] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:30.543 13:37:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:30.543 13:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:30.543 13:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:30.543 "name": "Existed_Raid", 00:09:30.543 "uuid": "e9bbd8ca-0ead-4112-a36c-b49b64d2767f", 00:09:30.543 "strip_size_kb": 64, 00:09:30.543 "state": "configuring", 00:09:30.543 "raid_level": "raid0", 00:09:30.543 "superblock": true, 00:09:30.543 "num_base_bdevs": 2, 00:09:30.543 "num_base_bdevs_discovered": 1, 00:09:30.543 "num_base_bdevs_operational": 2, 00:09:30.543 "base_bdevs_list": [ 00:09:30.543 { 00:09:30.543 "name": "BaseBdev1", 00:09:30.543 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:30.543 "is_configured": true, 00:09:30.544 "data_offset": 2048, 00:09:30.544 "data_size": 63488 00:09:30.544 }, 00:09:30.544 { 00:09:30.544 "name": "BaseBdev2", 00:09:30.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:30.544 "is_configured": false, 00:09:30.544 "data_offset": 0, 00:09:30.544 "data_size": 0 00:09:30.544 } 00:09:30.544 ] 00:09:30.544 }' 00:09:30.544 13:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:30.544 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:31.114 13:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:31.374 [2024-06-10 13:37:45.772288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:31.374 [2024-06-10 13:37:45.772395] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1536e00 00:09:31.374 
[2024-06-10 13:37:45.772403] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:31.374 [2024-06-10 13:37:45.772553] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15380e0 00:09:31.374 [2024-06-10 13:37:45.772641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1536e00 00:09:31.374 [2024-06-10 13:37:45.772648] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1536e00 00:09:31.374 [2024-06-10 13:37:45.772722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:31.374 BaseBdev2 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:31.374 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:31.635 13:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:31.896 [ 00:09:31.896 { 00:09:31.896 "name": "BaseBdev2", 00:09:31.896 "aliases": [ 00:09:31.896 "fb2eba6b-a588-4145-96b5-7420c74d6fe3" 00:09:31.896 ], 00:09:31.896 "product_name": "Malloc disk", 00:09:31.896 "block_size": 512, 
00:09:31.896 "num_blocks": 65536, 00:09:31.896 "uuid": "fb2eba6b-a588-4145-96b5-7420c74d6fe3", 00:09:31.896 "assigned_rate_limits": { 00:09:31.896 "rw_ios_per_sec": 0, 00:09:31.896 "rw_mbytes_per_sec": 0, 00:09:31.896 "r_mbytes_per_sec": 0, 00:09:31.896 "w_mbytes_per_sec": 0 00:09:31.896 }, 00:09:31.896 "claimed": true, 00:09:31.896 "claim_type": "exclusive_write", 00:09:31.896 "zoned": false, 00:09:31.896 "supported_io_types": { 00:09:31.896 "read": true, 00:09:31.896 "write": true, 00:09:31.896 "unmap": true, 00:09:31.896 "write_zeroes": true, 00:09:31.896 "flush": true, 00:09:31.896 "reset": true, 00:09:31.896 "compare": false, 00:09:31.896 "compare_and_write": false, 00:09:31.896 "abort": true, 00:09:31.896 "nvme_admin": false, 00:09:31.896 "nvme_io": false 00:09:31.896 }, 00:09:31.896 "memory_domains": [ 00:09:31.896 { 00:09:31.896 "dma_device_id": "system", 00:09:31.896 "dma_device_type": 1 00:09:31.896 }, 00:09:31.896 { 00:09:31.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:31.896 "dma_device_type": 2 00:09:31.896 } 00:09:31.896 ], 00:09:31.896 "driver_specific": {} 00:09:31.896 } 00:09:31.896 ] 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:31.896 13:37:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:31.896 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:32.156 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:32.156 "name": "Existed_Raid", 00:09:32.156 "uuid": "e9bbd8ca-0ead-4112-a36c-b49b64d2767f", 00:09:32.156 "strip_size_kb": 64, 00:09:32.156 "state": "online", 00:09:32.156 "raid_level": "raid0", 00:09:32.157 "superblock": true, 00:09:32.157 "num_base_bdevs": 2, 00:09:32.157 "num_base_bdevs_discovered": 2, 00:09:32.157 "num_base_bdevs_operational": 2, 00:09:32.157 "base_bdevs_list": [ 00:09:32.157 { 00:09:32.157 "name": "BaseBdev1", 00:09:32.157 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:32.157 "is_configured": true, 00:09:32.157 "data_offset": 2048, 00:09:32.157 "data_size": 63488 00:09:32.157 }, 00:09:32.157 { 00:09:32.157 "name": "BaseBdev2", 00:09:32.157 "uuid": "fb2eba6b-a588-4145-96b5-7420c74d6fe3", 00:09:32.157 "is_configured": true, 00:09:32.157 "data_offset": 2048, 00:09:32.157 "data_size": 63488 00:09:32.157 } 00:09:32.157 ] 00:09:32.157 }' 00:09:32.157 
13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:32.157 13:37:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:32.726 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:32.727 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:32.727 13:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:32.727 [2024-06-10 13:37:47.127931] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:32.727 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:32.727 "name": "Existed_Raid", 00:09:32.727 "aliases": [ 00:09:32.727 "e9bbd8ca-0ead-4112-a36c-b49b64d2767f" 00:09:32.727 ], 00:09:32.727 "product_name": "Raid Volume", 00:09:32.727 "block_size": 512, 00:09:32.727 "num_blocks": 126976, 00:09:32.727 "uuid": "e9bbd8ca-0ead-4112-a36c-b49b64d2767f", 00:09:32.727 "assigned_rate_limits": { 00:09:32.727 "rw_ios_per_sec": 0, 00:09:32.727 "rw_mbytes_per_sec": 0, 00:09:32.727 "r_mbytes_per_sec": 0, 00:09:32.727 "w_mbytes_per_sec": 0 00:09:32.727 }, 00:09:32.727 "claimed": false, 00:09:32.727 "zoned": false, 00:09:32.727 
"supported_io_types": { 00:09:32.727 "read": true, 00:09:32.727 "write": true, 00:09:32.727 "unmap": true, 00:09:32.727 "write_zeroes": true, 00:09:32.727 "flush": true, 00:09:32.727 "reset": true, 00:09:32.727 "compare": false, 00:09:32.727 "compare_and_write": false, 00:09:32.727 "abort": false, 00:09:32.727 "nvme_admin": false, 00:09:32.727 "nvme_io": false 00:09:32.727 }, 00:09:32.727 "memory_domains": [ 00:09:32.727 { 00:09:32.727 "dma_device_id": "system", 00:09:32.727 "dma_device_type": 1 00:09:32.727 }, 00:09:32.727 { 00:09:32.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:32.727 "dma_device_type": 2 00:09:32.727 }, 00:09:32.727 { 00:09:32.727 "dma_device_id": "system", 00:09:32.727 "dma_device_type": 1 00:09:32.727 }, 00:09:32.727 { 00:09:32.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:32.727 "dma_device_type": 2 00:09:32.727 } 00:09:32.727 ], 00:09:32.727 "driver_specific": { 00:09:32.727 "raid": { 00:09:32.727 "uuid": "e9bbd8ca-0ead-4112-a36c-b49b64d2767f", 00:09:32.727 "strip_size_kb": 64, 00:09:32.727 "state": "online", 00:09:32.727 "raid_level": "raid0", 00:09:32.727 "superblock": true, 00:09:32.727 "num_base_bdevs": 2, 00:09:32.727 "num_base_bdevs_discovered": 2, 00:09:32.727 "num_base_bdevs_operational": 2, 00:09:32.727 "base_bdevs_list": [ 00:09:32.727 { 00:09:32.727 "name": "BaseBdev1", 00:09:32.727 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:32.727 "is_configured": true, 00:09:32.727 "data_offset": 2048, 00:09:32.727 "data_size": 63488 00:09:32.727 }, 00:09:32.727 { 00:09:32.727 "name": "BaseBdev2", 00:09:32.727 "uuid": "fb2eba6b-a588-4145-96b5-7420c74d6fe3", 00:09:32.727 "is_configured": true, 00:09:32.727 "data_offset": 2048, 00:09:32.727 "data_size": 63488 00:09:32.727 } 00:09:32.727 ] 00:09:32.727 } 00:09:32.727 } 00:09:32.727 }' 00:09:32.727 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:32.727 
13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:32.727 BaseBdev2' 00:09:32.727 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:32.727 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:32.727 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:32.986 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:32.986 "name": "BaseBdev1", 00:09:32.986 "aliases": [ 00:09:32.986 "97624ef6-ee7b-4736-9eab-bc0864e8ef91" 00:09:32.986 ], 00:09:32.986 "product_name": "Malloc disk", 00:09:32.986 "block_size": 512, 00:09:32.986 "num_blocks": 65536, 00:09:32.986 "uuid": "97624ef6-ee7b-4736-9eab-bc0864e8ef91", 00:09:32.986 "assigned_rate_limits": { 00:09:32.986 "rw_ios_per_sec": 0, 00:09:32.986 "rw_mbytes_per_sec": 0, 00:09:32.986 "r_mbytes_per_sec": 0, 00:09:32.986 "w_mbytes_per_sec": 0 00:09:32.986 }, 00:09:32.986 "claimed": true, 00:09:32.986 "claim_type": "exclusive_write", 00:09:32.986 "zoned": false, 00:09:32.986 "supported_io_types": { 00:09:32.986 "read": true, 00:09:32.986 "write": true, 00:09:32.986 "unmap": true, 00:09:32.986 "write_zeroes": true, 00:09:32.986 "flush": true, 00:09:32.987 "reset": true, 00:09:32.987 "compare": false, 00:09:32.987 "compare_and_write": false, 00:09:32.987 "abort": true, 00:09:32.987 "nvme_admin": false, 00:09:32.987 "nvme_io": false 00:09:32.987 }, 00:09:32.987 "memory_domains": [ 00:09:32.987 { 00:09:32.987 "dma_device_id": "system", 00:09:32.987 "dma_device_type": 1 00:09:32.987 }, 00:09:32.987 { 00:09:32.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:32.987 "dma_device_type": 2 00:09:32.987 } 00:09:32.987 ], 00:09:32.987 "driver_specific": {} 00:09:32.987 }' 00:09:32.987 13:37:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:32.987 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:33.247 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:33.507 "name": "BaseBdev2", 00:09:33.507 "aliases": [ 00:09:33.507 "fb2eba6b-a588-4145-96b5-7420c74d6fe3" 00:09:33.507 ], 00:09:33.507 "product_name": "Malloc disk", 00:09:33.507 "block_size": 512, 00:09:33.507 
"num_blocks": 65536, 00:09:33.507 "uuid": "fb2eba6b-a588-4145-96b5-7420c74d6fe3", 00:09:33.507 "assigned_rate_limits": { 00:09:33.507 "rw_ios_per_sec": 0, 00:09:33.507 "rw_mbytes_per_sec": 0, 00:09:33.507 "r_mbytes_per_sec": 0, 00:09:33.507 "w_mbytes_per_sec": 0 00:09:33.507 }, 00:09:33.507 "claimed": true, 00:09:33.507 "claim_type": "exclusive_write", 00:09:33.507 "zoned": false, 00:09:33.507 "supported_io_types": { 00:09:33.507 "read": true, 00:09:33.507 "write": true, 00:09:33.507 "unmap": true, 00:09:33.507 "write_zeroes": true, 00:09:33.507 "flush": true, 00:09:33.507 "reset": true, 00:09:33.507 "compare": false, 00:09:33.507 "compare_and_write": false, 00:09:33.507 "abort": true, 00:09:33.507 "nvme_admin": false, 00:09:33.507 "nvme_io": false 00:09:33.507 }, 00:09:33.507 "memory_domains": [ 00:09:33.507 { 00:09:33.507 "dma_device_id": "system", 00:09:33.507 "dma_device_type": 1 00:09:33.507 }, 00:09:33.507 { 00:09:33.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:33.507 "dma_device_type": 2 00:09:33.507 } 00:09:33.507 ], 00:09:33.507 "driver_specific": {} 00:09:33.507 }' 00:09:33.507 13:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:33.767 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:33.768 13:37:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:33.768 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.028 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:34.028 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:34.028 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:34.289 [2024-06-10 13:37:48.511271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:34.289 [2024-06-10 13:37:48.511288] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:34.289 [2024-06-10 13:37:48.511321] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:34.289 "name": "Existed_Raid", 00:09:34.289 "uuid": "e9bbd8ca-0ead-4112-a36c-b49b64d2767f", 00:09:34.289 "strip_size_kb": 64, 00:09:34.289 "state": "offline", 00:09:34.289 "raid_level": "raid0", 00:09:34.289 "superblock": true, 00:09:34.289 "num_base_bdevs": 2, 00:09:34.289 "num_base_bdevs_discovered": 1, 00:09:34.289 "num_base_bdevs_operational": 1, 00:09:34.289 "base_bdevs_list": [ 00:09:34.289 { 00:09:34.289 "name": null, 00:09:34.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:34.289 "is_configured": false, 00:09:34.289 "data_offset": 2048, 00:09:34.289 "data_size": 63488 00:09:34.289 }, 00:09:34.289 { 00:09:34.289 "name": "BaseBdev2", 00:09:34.289 "uuid": "fb2eba6b-a588-4145-96b5-7420c74d6fe3", 00:09:34.289 "is_configured": true, 00:09:34.289 "data_offset": 2048, 00:09:34.289 "data_size": 63488 00:09:34.289 } 00:09:34.289 
] 00:09:34.289 }' 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:34.289 13:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:34.860 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:34.860 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:34.860 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.860 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:35.121 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:35.121 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:35.121 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:35.380 [2024-06-10 13:37:49.658207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:35.380 [2024-06-10 13:37:49.658241] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1536e00 name Existed_Raid, state offline 00:09:35.380 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:35.380 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:35.380 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:35.380 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | 
select(.)' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1488374 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1488374 ']' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1488374 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1488374 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1488374' 00:09:35.641 killing process with pid 1488374 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1488374 00:09:35.641 [2024-06-10 13:37:49.954100] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:35.641 13:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1488374 00:09:35.641 [2024-06-10 13:37:49.954715] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:35.641 13:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:35.641 
00:09:35.641 real 0m9.269s 00:09:35.641 user 0m16.838s 00:09:35.641 sys 0m1.402s 00:09:35.641 13:37:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:35.641 13:37:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:35.641 ************************************ 00:09:35.641 END TEST raid_state_function_test_sb 00:09:35.641 ************************************ 00:09:35.903 13:37:50 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:35.903 13:37:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:09:35.903 13:37:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:35.903 13:37:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:35.903 ************************************ 00:09:35.903 START TEST raid_superblock_test 00:09:35.903 ************************************ 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 2 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1490384 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1490384 /var/tmp/spdk-raid.sock 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1490384 ']' 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:35.903 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:35.903 13:37:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.903 [2024-06-10 13:37:50.214845] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:09:35.903 [2024-06-10 13:37:50.214892] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1490384 ] 00:09:35.903 [2024-06-10 13:37:50.302540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.903 [2024-06-10 13:37:50.367722] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.163 [2024-06-10 13:37:50.416510] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:36.163 [2024-06-10 13:37:50.416534] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:36.732 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:36.992 malloc1 00:09:36.992 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:37.252 [2024-06-10 13:37:51.471913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:37.252 [2024-06-10 13:37:51.471949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:37.252 [2024-06-10 13:37:51.471962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb5550 00:09:37.252 [2024-06-10 13:37:51.471969] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:37.252 [2024-06-10 13:37:51.473317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:37.252 [2024-06-10 13:37:51.473336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:37.252 pt1 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:37.252 malloc2 00:09:37.252 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:37.513 [2024-06-10 13:37:51.895096] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:37.513 [2024-06-10 13:37:51.895127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:37.513 [2024-06-10 13:37:51.895138] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c770f0 00:09:37.513 [2024-06-10 13:37:51.895145] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:37.513 [2024-06-10 13:37:51.896424] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:37.513 [2024-06-10 13:37:51.896444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:37.513 pt2 00:09:37.513 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:37.513 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:37.513 13:37:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:37.774 [2024-06-10 13:37:52.095611] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:37.774 [2024-06-10 13:37:52.096684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:37.774 [2024-06-10 13:37:52.096798] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c81690 00:09:37.774 [2024-06-10 13:37:52.096807] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:37.774 [2024-06-10 13:37:52.096966] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c792e0 00:09:37.774 [2024-06-10 13:37:52.097085] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c81690 00:09:37.774 [2024-06-10 13:37:52.097091] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c81690 00:09:37.774 [2024-06-10 13:37:52.097175] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:37.774 13:37:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:37.774 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:38.034 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:38.034 "name": "raid_bdev1", 00:09:38.034 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:38.034 "strip_size_kb": 64, 00:09:38.034 "state": "online", 00:09:38.034 "raid_level": "raid0", 00:09:38.034 "superblock": true, 00:09:38.034 "num_base_bdevs": 2, 00:09:38.034 "num_base_bdevs_discovered": 2, 00:09:38.034 "num_base_bdevs_operational": 2, 00:09:38.034 "base_bdevs_list": [ 00:09:38.034 { 00:09:38.034 "name": "pt1", 00:09:38.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:38.034 "is_configured": true, 00:09:38.034 "data_offset": 2048, 00:09:38.034 "data_size": 63488 00:09:38.034 }, 00:09:38.034 { 00:09:38.034 "name": "pt2", 00:09:38.034 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:38.035 "is_configured": true, 00:09:38.035 "data_offset": 2048, 00:09:38.035 "data_size": 63488 00:09:38.035 } 00:09:38.035 ] 00:09:38.035 }' 00:09:38.035 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:38.035 13:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:38.645 13:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:38.645 [2024-06-10 13:37:53.074259] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:38.645 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:38.645 "name": "raid_bdev1", 00:09:38.645 "aliases": [ 00:09:38.645 "99ed2251-05fe-4a09-a761-92664dabc1cf" 00:09:38.645 ], 00:09:38.645 "product_name": "Raid Volume", 00:09:38.645 "block_size": 512, 00:09:38.645 "num_blocks": 126976, 00:09:38.645 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:38.645 "assigned_rate_limits": { 00:09:38.645 "rw_ios_per_sec": 0, 00:09:38.645 "rw_mbytes_per_sec": 0, 00:09:38.645 "r_mbytes_per_sec": 0, 00:09:38.645 "w_mbytes_per_sec": 0 00:09:38.645 }, 00:09:38.645 "claimed": false, 00:09:38.645 "zoned": false, 00:09:38.645 "supported_io_types": { 00:09:38.645 "read": true, 00:09:38.646 "write": true, 00:09:38.646 "unmap": true, 00:09:38.646 "write_zeroes": true, 00:09:38.646 "flush": true, 00:09:38.646 "reset": true, 00:09:38.646 "compare": false, 00:09:38.646 "compare_and_write": false, 00:09:38.646 "abort": false, 00:09:38.646 "nvme_admin": false, 00:09:38.646 "nvme_io": false 00:09:38.646 }, 00:09:38.646 "memory_domains": [ 00:09:38.646 { 00:09:38.646 "dma_device_id": "system", 00:09:38.646 "dma_device_type": 1 00:09:38.646 }, 00:09:38.646 { 00:09:38.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:38.646 "dma_device_type": 2 00:09:38.646 }, 00:09:38.646 { 00:09:38.646 "dma_device_id": "system", 00:09:38.646 
"dma_device_type": 1 00:09:38.646 }, 00:09:38.646 { 00:09:38.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:38.646 "dma_device_type": 2 00:09:38.646 } 00:09:38.646 ], 00:09:38.646 "driver_specific": { 00:09:38.646 "raid": { 00:09:38.646 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:38.646 "strip_size_kb": 64, 00:09:38.646 "state": "online", 00:09:38.646 "raid_level": "raid0", 00:09:38.646 "superblock": true, 00:09:38.646 "num_base_bdevs": 2, 00:09:38.646 "num_base_bdevs_discovered": 2, 00:09:38.646 "num_base_bdevs_operational": 2, 00:09:38.646 "base_bdevs_list": [ 00:09:38.646 { 00:09:38.646 "name": "pt1", 00:09:38.646 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:38.646 "is_configured": true, 00:09:38.646 "data_offset": 2048, 00:09:38.646 "data_size": 63488 00:09:38.646 }, 00:09:38.646 { 00:09:38.646 "name": "pt2", 00:09:38.646 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:38.646 "is_configured": true, 00:09:38.646 "data_offset": 2048, 00:09:38.646 "data_size": 63488 00:09:38.646 } 00:09:38.646 ] 00:09:38.646 } 00:09:38.646 } 00:09:38.646 }' 00:09:38.646 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:38.961 pt2' 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:38.961 "name": "pt1", 00:09:38.961 "aliases": [ 00:09:38.961 "00000000-0000-0000-0000-000000000001" 00:09:38.961 
], 00:09:38.961 "product_name": "passthru", 00:09:38.961 "block_size": 512, 00:09:38.961 "num_blocks": 65536, 00:09:38.961 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:38.961 "assigned_rate_limits": { 00:09:38.961 "rw_ios_per_sec": 0, 00:09:38.961 "rw_mbytes_per_sec": 0, 00:09:38.961 "r_mbytes_per_sec": 0, 00:09:38.961 "w_mbytes_per_sec": 0 00:09:38.961 }, 00:09:38.961 "claimed": true, 00:09:38.961 "claim_type": "exclusive_write", 00:09:38.961 "zoned": false, 00:09:38.961 "supported_io_types": { 00:09:38.961 "read": true, 00:09:38.961 "write": true, 00:09:38.961 "unmap": true, 00:09:38.961 "write_zeroes": true, 00:09:38.961 "flush": true, 00:09:38.961 "reset": true, 00:09:38.961 "compare": false, 00:09:38.961 "compare_and_write": false, 00:09:38.961 "abort": true, 00:09:38.961 "nvme_admin": false, 00:09:38.961 "nvme_io": false 00:09:38.961 }, 00:09:38.961 "memory_domains": [ 00:09:38.961 { 00:09:38.961 "dma_device_id": "system", 00:09:38.961 "dma_device_type": 1 00:09:38.961 }, 00:09:38.961 { 00:09:38.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:38.961 "dma_device_type": 2 00:09:38.961 } 00:09:38.961 ], 00:09:38.961 "driver_specific": { 00:09:38.961 "passthru": { 00:09:38.961 "name": "pt1", 00:09:38.961 "base_bdev_name": "malloc1" 00:09:38.961 } 00:09:38.961 } 00:09:38.961 }' 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:38.961 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:39.222 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:39.483 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:39.483 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:39.483 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:39.483 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:39.483 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:39.483 "name": "pt2", 00:09:39.483 "aliases": [ 00:09:39.483 "00000000-0000-0000-0000-000000000002" 00:09:39.483 ], 00:09:39.483 "product_name": "passthru", 00:09:39.483 "block_size": 512, 00:09:39.483 "num_blocks": 65536, 00:09:39.483 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:39.483 "assigned_rate_limits": { 00:09:39.483 "rw_ios_per_sec": 0, 00:09:39.483 "rw_mbytes_per_sec": 0, 00:09:39.483 "r_mbytes_per_sec": 0, 00:09:39.483 "w_mbytes_per_sec": 0 00:09:39.483 }, 00:09:39.483 "claimed": true, 00:09:39.483 "claim_type": "exclusive_write", 00:09:39.483 "zoned": false, 00:09:39.483 "supported_io_types": { 00:09:39.483 "read": true, 00:09:39.483 "write": true, 00:09:39.483 "unmap": true, 00:09:39.483 "write_zeroes": true, 00:09:39.483 "flush": true, 00:09:39.483 "reset": true, 00:09:39.483 "compare": false, 00:09:39.483 "compare_and_write": false, 00:09:39.483 "abort": true, 00:09:39.483 "nvme_admin": false, 00:09:39.483 "nvme_io": false 00:09:39.483 }, 
00:09:39.483 "memory_domains": [ 00:09:39.483 { 00:09:39.483 "dma_device_id": "system", 00:09:39.483 "dma_device_type": 1 00:09:39.484 }, 00:09:39.484 { 00:09:39.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:39.484 "dma_device_type": 2 00:09:39.484 } 00:09:39.484 ], 00:09:39.484 "driver_specific": { 00:09:39.484 "passthru": { 00:09:39.484 "name": "pt2", 00:09:39.484 "base_bdev_name": "malloc2" 00:09:39.484 } 00:09:39.484 } 00:09:39.484 }' 00:09:39.484 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:39.484 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:39.745 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:39.745 13:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:39.745 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:40.006 [2024-06-10 13:37:54.441730] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=99ed2251-05fe-4a09-a761-92664dabc1cf 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 99ed2251-05fe-4a09-a761-92664dabc1cf ']' 00:09:40.006 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:40.266 [2024-06-10 13:37:54.642051] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:40.266 [2024-06-10 13:37:54.642062] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:40.266 [2024-06-10 13:37:54.642103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:40.266 [2024-06-10 13:37:54.642138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:40.266 [2024-06-10 13:37:54.642144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c81690 name raid_bdev1, state offline 00:09:40.266 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.266 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:40.526 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:40.526 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:40.526 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:40.526 13:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:09:40.797 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:40.797 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:40.797 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:40.797 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:41.061 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:41.322 [2024-06-10 13:37:55.652584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:41.322 [2024-06-10 13:37:55.653717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:41.322 [2024-06-10 13:37:55.653759] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:41.322 [2024-06-10 13:37:55.653788] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:41.322 [2024-06-10 13:37:55.653799] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:41.322 [2024-06-10 13:37:55.653804] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb82e0 name raid_bdev1, state configuring 00:09:41.322 request: 00:09:41.322 { 00:09:41.322 "name": "raid_bdev1", 00:09:41.322 "raid_level": "raid0", 00:09:41.322 "base_bdevs": [ 00:09:41.322 "malloc1", 00:09:41.322 "malloc2" 00:09:41.322 ], 00:09:41.322 "superblock": false, 00:09:41.322 "strip_size_kb": 64, 00:09:41.322 "method": "bdev_raid_create", 00:09:41.322 "req_id": 1 00:09:41.322 } 
00:09:41.322 Got JSON-RPC error response 00:09:41.322 response: 00:09:41.322 { 00:09:41.322 "code": -17, 00:09:41.322 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:41.322 } 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:41.322 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:41.583 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:41.583 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:41.583 13:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:41.844 [2024-06-10 13:37:56.061562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:41.844 [2024-06-10 13:37:56.061586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:41.844 [2024-06-10 13:37:56.061596] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb87c0 00:09:41.844 [2024-06-10 13:37:56.061603] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:41.844 [2024-06-10 13:37:56.062929] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:41.844 [2024-06-10 13:37:56.062948] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:41.844 [2024-06-10 13:37:56.062994] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:41.844 [2024-06-10 13:37:56.063012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:41.844 pt1 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:41.844 "name": "raid_bdev1", 00:09:41.844 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:41.844 "strip_size_kb": 64, 
00:09:41.844 "state": "configuring", 00:09:41.844 "raid_level": "raid0", 00:09:41.844 "superblock": true, 00:09:41.844 "num_base_bdevs": 2, 00:09:41.844 "num_base_bdevs_discovered": 1, 00:09:41.844 "num_base_bdevs_operational": 2, 00:09:41.844 "base_bdevs_list": [ 00:09:41.844 { 00:09:41.844 "name": "pt1", 00:09:41.844 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:41.844 "is_configured": true, 00:09:41.844 "data_offset": 2048, 00:09:41.844 "data_size": 63488 00:09:41.844 }, 00:09:41.844 { 00:09:41.844 "name": null, 00:09:41.844 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:41.844 "is_configured": false, 00:09:41.844 "data_offset": 2048, 00:09:41.844 "data_size": 63488 00:09:41.844 } 00:09:41.844 ] 00:09:41.844 }' 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:41.844 13:37:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:42.414 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:42.414 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:42.414 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:42.414 13:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:42.675 [2024-06-10 13:37:57.015998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:42.675 [2024-06-10 13:37:57.016028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:42.675 [2024-06-10 13:37:57.016038] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c77320 00:09:42.675 [2024-06-10 13:37:57.016045] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:42.675 [2024-06-10 
13:37:57.016335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:42.675 [2024-06-10 13:37:57.016347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:42.675 [2024-06-10 13:37:57.016391] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:42.675 [2024-06-10 13:37:57.016403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:42.675 [2024-06-10 13:37:57.016479] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb7c50 00:09:42.675 [2024-06-10 13:37:57.016485] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:42.675 [2024-06-10 13:37:57.016623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb5220 00:09:42.675 [2024-06-10 13:37:57.016724] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb7c50 00:09:42.675 [2024-06-10 13:37:57.016730] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bb7c50 00:09:42.675 [2024-06-10 13:37:57.016807] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:42.675 pt2 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:42.675 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:42.936 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:42.936 "name": "raid_bdev1", 00:09:42.936 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:42.936 "strip_size_kb": 64, 00:09:42.936 "state": "online", 00:09:42.936 "raid_level": "raid0", 00:09:42.936 "superblock": true, 00:09:42.936 "num_base_bdevs": 2, 00:09:42.936 "num_base_bdevs_discovered": 2, 00:09:42.936 "num_base_bdevs_operational": 2, 00:09:42.936 "base_bdevs_list": [ 00:09:42.936 { 00:09:42.936 "name": "pt1", 00:09:42.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:42.936 "is_configured": true, 00:09:42.936 "data_offset": 2048, 00:09:42.936 "data_size": 63488 00:09:42.936 }, 00:09:42.936 { 00:09:42.936 "name": "pt2", 00:09:42.936 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:42.936 "is_configured": true, 00:09:42.936 "data_offset": 2048, 00:09:42.936 "data_size": 63488 00:09:42.936 } 00:09:42.936 ] 00:09:42.936 }' 00:09:42.936 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:42.936 13:37:57 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:43.507 13:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:43.767 [2024-06-10 13:37:57.986641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:43.767 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:43.767 "name": "raid_bdev1", 00:09:43.767 "aliases": [ 00:09:43.767 "99ed2251-05fe-4a09-a761-92664dabc1cf" 00:09:43.767 ], 00:09:43.767 "product_name": "Raid Volume", 00:09:43.767 "block_size": 512, 00:09:43.767 "num_blocks": 126976, 00:09:43.767 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:43.767 "assigned_rate_limits": { 00:09:43.767 "rw_ios_per_sec": 0, 00:09:43.767 "rw_mbytes_per_sec": 0, 00:09:43.767 "r_mbytes_per_sec": 0, 00:09:43.767 "w_mbytes_per_sec": 0 00:09:43.767 }, 00:09:43.767 "claimed": false, 00:09:43.768 "zoned": false, 00:09:43.768 "supported_io_types": { 00:09:43.768 "read": true, 00:09:43.768 "write": true, 00:09:43.768 "unmap": true, 00:09:43.768 "write_zeroes": true, 00:09:43.768 "flush": true, 00:09:43.768 "reset": true, 00:09:43.768 "compare": 
false, 00:09:43.768 "compare_and_write": false, 00:09:43.768 "abort": false, 00:09:43.768 "nvme_admin": false, 00:09:43.768 "nvme_io": false 00:09:43.768 }, 00:09:43.768 "memory_domains": [ 00:09:43.768 { 00:09:43.768 "dma_device_id": "system", 00:09:43.768 "dma_device_type": 1 00:09:43.768 }, 00:09:43.768 { 00:09:43.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:43.768 "dma_device_type": 2 00:09:43.768 }, 00:09:43.768 { 00:09:43.768 "dma_device_id": "system", 00:09:43.768 "dma_device_type": 1 00:09:43.768 }, 00:09:43.768 { 00:09:43.768 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:43.768 "dma_device_type": 2 00:09:43.768 } 00:09:43.768 ], 00:09:43.768 "driver_specific": { 00:09:43.768 "raid": { 00:09:43.768 "uuid": "99ed2251-05fe-4a09-a761-92664dabc1cf", 00:09:43.768 "strip_size_kb": 64, 00:09:43.768 "state": "online", 00:09:43.768 "raid_level": "raid0", 00:09:43.768 "superblock": true, 00:09:43.768 "num_base_bdevs": 2, 00:09:43.768 "num_base_bdevs_discovered": 2, 00:09:43.768 "num_base_bdevs_operational": 2, 00:09:43.768 "base_bdevs_list": [ 00:09:43.768 { 00:09:43.768 "name": "pt1", 00:09:43.768 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:43.768 "is_configured": true, 00:09:43.768 "data_offset": 2048, 00:09:43.768 "data_size": 63488 00:09:43.768 }, 00:09:43.768 { 00:09:43.768 "name": "pt2", 00:09:43.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:43.768 "is_configured": true, 00:09:43.768 "data_offset": 2048, 00:09:43.768 "data_size": 63488 00:09:43.768 } 00:09:43.768 ] 00:09:43.768 } 00:09:43.768 } 00:09:43.768 }' 00:09:43.768 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:43.768 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:43.768 pt2' 00:09:43.768 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:43.768 13:37:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:43.768 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:44.029 "name": "pt1", 00:09:44.029 "aliases": [ 00:09:44.029 "00000000-0000-0000-0000-000000000001" 00:09:44.029 ], 00:09:44.029 "product_name": "passthru", 00:09:44.029 "block_size": 512, 00:09:44.029 "num_blocks": 65536, 00:09:44.029 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:44.029 "assigned_rate_limits": { 00:09:44.029 "rw_ios_per_sec": 0, 00:09:44.029 "rw_mbytes_per_sec": 0, 00:09:44.029 "r_mbytes_per_sec": 0, 00:09:44.029 "w_mbytes_per_sec": 0 00:09:44.029 }, 00:09:44.029 "claimed": true, 00:09:44.029 "claim_type": "exclusive_write", 00:09:44.029 "zoned": false, 00:09:44.029 "supported_io_types": { 00:09:44.029 "read": true, 00:09:44.029 "write": true, 00:09:44.029 "unmap": true, 00:09:44.029 "write_zeroes": true, 00:09:44.029 "flush": true, 00:09:44.029 "reset": true, 00:09:44.029 "compare": false, 00:09:44.029 "compare_and_write": false, 00:09:44.029 "abort": true, 00:09:44.029 "nvme_admin": false, 00:09:44.029 "nvme_io": false 00:09:44.029 }, 00:09:44.029 "memory_domains": [ 00:09:44.029 { 00:09:44.029 "dma_device_id": "system", 00:09:44.029 "dma_device_type": 1 00:09:44.029 }, 00:09:44.029 { 00:09:44.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.029 "dma_device_type": 2 00:09:44.029 } 00:09:44.029 ], 00:09:44.029 "driver_specific": { 00:09:44.029 "passthru": { 00:09:44.029 "name": "pt1", 00:09:44.029 "base_bdev_name": "malloc1" 00:09:44.029 } 00:09:44.029 } 00:09:44.029 }' 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:44.029 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:44.290 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:44.290 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:44.290 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:44.290 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:44.290 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:44.550 "name": "pt2", 00:09:44.550 "aliases": [ 00:09:44.550 "00000000-0000-0000-0000-000000000002" 00:09:44.550 ], 00:09:44.550 "product_name": "passthru", 00:09:44.550 "block_size": 512, 00:09:44.550 "num_blocks": 65536, 00:09:44.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:44.550 "assigned_rate_limits": { 00:09:44.550 "rw_ios_per_sec": 0, 00:09:44.550 "rw_mbytes_per_sec": 0, 00:09:44.550 "r_mbytes_per_sec": 0, 00:09:44.550 "w_mbytes_per_sec": 0 00:09:44.550 }, 00:09:44.550 
"claimed": true, 00:09:44.550 "claim_type": "exclusive_write", 00:09:44.550 "zoned": false, 00:09:44.550 "supported_io_types": { 00:09:44.550 "read": true, 00:09:44.550 "write": true, 00:09:44.550 "unmap": true, 00:09:44.550 "write_zeroes": true, 00:09:44.550 "flush": true, 00:09:44.550 "reset": true, 00:09:44.550 "compare": false, 00:09:44.550 "compare_and_write": false, 00:09:44.550 "abort": true, 00:09:44.550 "nvme_admin": false, 00:09:44.550 "nvme_io": false 00:09:44.550 }, 00:09:44.550 "memory_domains": [ 00:09:44.550 { 00:09:44.550 "dma_device_id": "system", 00:09:44.550 "dma_device_type": 1 00:09:44.550 }, 00:09:44.550 { 00:09:44.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.550 "dma_device_type": 2 00:09:44.550 } 00:09:44.550 ], 00:09:44.550 "driver_specific": { 00:09:44.550 "passthru": { 00:09:44.550 "name": "pt2", 00:09:44.550 "base_bdev_name": "malloc2" 00:09:44.550 } 00:09:44.550 } 00:09:44.550 }' 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:44.550 13:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:44.550 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:44.811 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:44.812 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:44.812 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:09:44.812 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:44.812 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:44.812 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:45.073 [2024-06-10 13:37:59.326082] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 99ed2251-05fe-4a09-a761-92664dabc1cf '!=' 99ed2251-05fe-4a09-a761-92664dabc1cf ']' 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1490384 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1490384 ']' 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1490384 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1490384 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing 
process with pid 1490384' 00:09:45.073 killing process with pid 1490384 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1490384 00:09:45.073 [2024-06-10 13:37:59.396828] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:45.073 [2024-06-10 13:37:59.396868] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:45.073 [2024-06-10 13:37:59.396900] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:45.073 [2024-06-10 13:37:59.396906] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb7c50 name raid_bdev1, state offline 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1490384 00:09:45.073 [2024-06-10 13:37:59.406551] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:45.073 00:09:45.073 real 0m9.383s 00:09:45.073 user 0m17.123s 00:09:45.073 sys 0m1.421s 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:45.073 13:37:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:45.073 ************************************ 00:09:45.073 END TEST raid_superblock_test 00:09:45.073 ************************************ 00:09:45.333 13:37:59 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:09:45.334 13:37:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:45.334 13:37:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:45.334 13:37:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:45.334 ************************************ 00:09:45.334 START TEST raid_read_error_test 00:09:45.334 ************************************ 00:09:45.334 13:37:59 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 read 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:45.334 13:37:59 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Y8XKMHULvN 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1492470 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1492470 /var/tmp/spdk-raid.sock 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1492470 ']' 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:45.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:45.334 13:37:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:45.334 [2024-06-10 13:37:59.677246] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:09:45.334 [2024-06-10 13:37:59.677298] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1492470 ] 00:09:45.334 [2024-06-10 13:37:59.768013] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:45.595 [2024-06-10 13:37:59.838780] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.595 [2024-06-10 13:37:59.880212] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:45.595 [2024-06-10 13:37:59.880236] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:46.167 13:38:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:46.167 13:38:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:09:46.167 13:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:46.167 13:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:46.427 BaseBdev1_malloc 00:09:46.427 13:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:46.688 true 00:09:46.688 13:38:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:46.688 [2024-06-10 13:38:01.116050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:46.688 [2024-06-10 13:38:01.116084] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:46.688 
[2024-06-10 13:38:01.116096] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2936c90 00:09:46.688 [2024-06-10 13:38:01.116103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:46.688 [2024-06-10 13:38:01.117572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:46.688 [2024-06-10 13:38:01.117593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:46.688 BaseBdev1 00:09:46.688 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:46.688 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:46.948 BaseBdev2_malloc 00:09:46.948 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:47.208 true 00:09:47.208 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:47.470 [2024-06-10 13:38:01.719647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:47.470 [2024-06-10 13:38:01.719676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:47.470 [2024-06-10 13:38:01.719688] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x293b400 00:09:47.470 [2024-06-10 13:38:01.719695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:47.470 [2024-06-10 13:38:01.720967] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:47.470 [2024-06-10 13:38:01.720986] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:47.470 BaseBdev2 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:47.470 [2024-06-10 13:38:01.920176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:47.470 [2024-06-10 13:38:01.921256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:47.470 [2024-06-10 13:38:01.921406] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x293ae20 00:09:47.470 [2024-06-10 13:38:01.921415] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:47.470 [2024-06-10 13:38:01.921571] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x293f4b0 00:09:47.470 [2024-06-10 13:38:01.921690] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x293ae20 00:09:47.470 [2024-06-10 13:38:01.921696] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x293ae20 00:09:47.470 [2024-06-10 13:38:01.921776] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:47.470 13:38:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:47.733 13:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:47.733 "name": "raid_bdev1", 00:09:47.733 "uuid": "00baf272-5976-48a9-9888-270b0092c011", 00:09:47.733 "strip_size_kb": 64, 00:09:47.733 "state": "online", 00:09:47.733 "raid_level": "raid0", 00:09:47.733 "superblock": true, 00:09:47.733 "num_base_bdevs": 2, 00:09:47.733 "num_base_bdevs_discovered": 2, 00:09:47.733 "num_base_bdevs_operational": 2, 00:09:47.733 "base_bdevs_list": [ 00:09:47.733 { 00:09:47.733 "name": "BaseBdev1", 00:09:47.733 "uuid": "6e74260d-81d5-5c55-ab9e-c6b289e8635b", 00:09:47.733 "is_configured": true, 00:09:47.733 "data_offset": 2048, 00:09:47.733 "data_size": 63488 00:09:47.733 }, 00:09:47.733 { 00:09:47.733 "name": "BaseBdev2", 00:09:47.733 "uuid": "1162fe1b-022f-55cb-8748-41be3b159018", 00:09:47.733 "is_configured": true, 00:09:47.733 "data_offset": 2048, 00:09:47.733 "data_size": 63488 00:09:47.733 } 00:09:47.733 ] 00:09:47.733 }' 00:09:47.733 13:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:47.733 13:38:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:48.304 13:38:02 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:48.304 13:38:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:48.304 [2024-06-10 13:38:02.762509] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278c970 00:09:49.245 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.506 13:38:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:49.767 13:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:49.767 "name": "raid_bdev1", 00:09:49.767 "uuid": "00baf272-5976-48a9-9888-270b0092c011", 00:09:49.767 "strip_size_kb": 64, 00:09:49.767 "state": "online", 00:09:49.767 "raid_level": "raid0", 00:09:49.767 "superblock": true, 00:09:49.767 "num_base_bdevs": 2, 00:09:49.767 "num_base_bdevs_discovered": 2, 00:09:49.767 "num_base_bdevs_operational": 2, 00:09:49.767 "base_bdevs_list": [ 00:09:49.767 { 00:09:49.767 "name": "BaseBdev1", 00:09:49.767 "uuid": "6e74260d-81d5-5c55-ab9e-c6b289e8635b", 00:09:49.767 "is_configured": true, 00:09:49.767 "data_offset": 2048, 00:09:49.767 "data_size": 63488 00:09:49.767 }, 00:09:49.767 { 00:09:49.767 "name": "BaseBdev2", 00:09:49.767 "uuid": "1162fe1b-022f-55cb-8748-41be3b159018", 00:09:49.767 "is_configured": true, 00:09:49.767 "data_offset": 2048, 00:09:49.767 "data_size": 63488 00:09:49.767 } 00:09:49.767 ] 00:09:49.767 }' 00:09:49.767 13:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:49.767 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.338 13:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:50.599 [2024-06-10 13:38:04.822348] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:50.599 [2024-06-10 13:38:04.822376] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 
00:09:50.599 [2024-06-10 13:38:04.825169] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:50.599 [2024-06-10 13:38:04.825193] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:50.599 [2024-06-10 13:38:04.825212] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:50.599 [2024-06-10 13:38:04.825218] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x293ae20 name raid_bdev1, state offline 00:09:50.599 0 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1492470 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1492470 ']' 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1492470 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1492470 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1492470' 00:09:50.599 killing process with pid 1492470 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1492470 00:09:50.599 [2024-06-10 13:38:04.897596] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:50.599 13:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1492470 00:09:50.599 [2024-06-10 13:38:04.902820] bdev_raid.c:1375:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Y8XKMHULvN 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:09:50.599 00:09:50.599 real 0m5.431s 00:09:50.599 user 0m8.590s 00:09:50.599 sys 0m0.752s 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:50.599 13:38:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.599 ************************************ 00:09:50.599 END TEST raid_read_error_test 00:09:50.599 ************************************ 00:09:50.861 13:38:05 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:09:50.861 13:38:05 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:50.861 13:38:05 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:50.861 13:38:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:50.861 ************************************ 00:09:50.861 START TEST raid_write_error_test 00:09:50.861 ************************************ 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 write 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.npEcxEhybR 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1493622 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1493622 /var/tmp/spdk-raid.sock 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1493622 ']' 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:50.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:50.861 13:38:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.861 [2024-06-10 13:38:05.184280] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:09:50.861 [2024-06-10 13:38:05.184333] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1493622 ] 00:09:50.861 [2024-06-10 13:38:05.277048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:51.122 [2024-06-10 13:38:05.353756] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:51.122 [2024-06-10 13:38:05.406389] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:51.122 [2024-06-10 13:38:05.406413] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:51.694 13:38:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:51.694 13:38:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:09:51.694 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:51.694 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:51.955 BaseBdev1_malloc 00:09:51.955 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:52.215 true 00:09:52.215 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:52.215 [2024-06-10 13:38:06.630477] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:52.215 [2024-06-10 13:38:06.630508] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:09:52.215 [2024-06-10 13:38:06.630520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda3c90 00:09:52.215 [2024-06-10 13:38:06.630526] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:52.215 [2024-06-10 13:38:06.631952] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:52.215 [2024-06-10 13:38:06.631972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:52.215 BaseBdev1 00:09:52.215 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:52.215 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:52.476 BaseBdev2_malloc 00:09:52.476 13:38:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:52.737 true 00:09:52.737 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:52.997 [2024-06-10 13:38:07.213851] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:52.997 [2024-06-10 13:38:07.213881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:52.997 [2024-06-10 13:38:07.213892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda8400 00:09:52.997 [2024-06-10 13:38:07.213898] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:52.997 [2024-06-10 13:38:07.215134] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:52.997 [2024-06-10 13:38:07.215153] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:52.997 BaseBdev2 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:52.997 [2024-06-10 13:38:07.418388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:52.997 [2024-06-10 13:38:07.419434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:52.997 [2024-06-10 13:38:07.419582] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xda7e20 00:09:52.997 [2024-06-10 13:38:07.419590] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:52.997 [2024-06-10 13:38:07.419746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdac4b0 00:09:52.997 [2024-06-10 13:38:07.419862] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xda7e20 00:09:52.997 [2024-06-10 13:38:07.419868] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xda7e20 00:09:52.997 [2024-06-10 13:38:07.419946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:52.997 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:53.258 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:53.258 "name": "raid_bdev1", 00:09:53.258 "uuid": "5dd28655-60db-44a7-a9f0-3d67aad0d30a", 00:09:53.258 "strip_size_kb": 64, 00:09:53.258 "state": "online", 00:09:53.258 "raid_level": "raid0", 00:09:53.258 "superblock": true, 00:09:53.258 "num_base_bdevs": 2, 00:09:53.258 "num_base_bdevs_discovered": 2, 00:09:53.258 "num_base_bdevs_operational": 2, 00:09:53.258 "base_bdevs_list": [ 00:09:53.258 { 00:09:53.258 "name": "BaseBdev1", 00:09:53.258 "uuid": "5e293785-b132-5062-8f61-f6661e86fa08", 00:09:53.258 "is_configured": true, 00:09:53.258 "data_offset": 2048, 00:09:53.258 "data_size": 63488 00:09:53.258 }, 00:09:53.258 { 00:09:53.258 "name": "BaseBdev2", 00:09:53.258 "uuid": "9685a688-91cf-5be0-a571-16fdc1a784b1", 00:09:53.258 "is_configured": true, 00:09:53.258 "data_offset": 2048, 00:09:53.258 "data_size": 63488 00:09:53.258 } 00:09:53.258 ] 00:09:53.258 }' 00:09:53.258 13:38:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:53.258 13:38:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:53.829 
13:38:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:53.829 13:38:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:53.829 [2024-06-10 13:38:08.236659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf9970 00:09:54.772 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:55.033 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:55.294 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:55.294 "name": "raid_bdev1", 00:09:55.294 "uuid": "5dd28655-60db-44a7-a9f0-3d67aad0d30a", 00:09:55.294 "strip_size_kb": 64, 00:09:55.294 "state": "online", 00:09:55.294 "raid_level": "raid0", 00:09:55.294 "superblock": true, 00:09:55.294 "num_base_bdevs": 2, 00:09:55.294 "num_base_bdevs_discovered": 2, 00:09:55.294 "num_base_bdevs_operational": 2, 00:09:55.294 "base_bdevs_list": [ 00:09:55.294 { 00:09:55.294 "name": "BaseBdev1", 00:09:55.294 "uuid": "5e293785-b132-5062-8f61-f6661e86fa08", 00:09:55.294 "is_configured": true, 00:09:55.294 "data_offset": 2048, 00:09:55.294 "data_size": 63488 00:09:55.294 }, 00:09:55.294 { 00:09:55.294 "name": "BaseBdev2", 00:09:55.294 "uuid": "9685a688-91cf-5be0-a571-16fdc1a784b1", 00:09:55.294 "is_configured": true, 00:09:55.294 "data_offset": 2048, 00:09:55.294 "data_size": 63488 00:09:55.294 } 00:09:55.294 ] 00:09:55.294 }' 00:09:55.294 13:38:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:55.294 13:38:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:55.866 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:56.127 [2024-06-10 13:38:10.351576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:56.127 [2024-06-10 13:38:10.351604] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:09:56.127 [2024-06-10 13:38:10.354412] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:56.127 [2024-06-10 13:38:10.354438] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:56.127 [2024-06-10 13:38:10.354456] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:56.127 [2024-06-10 13:38:10.354463] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda7e20 name raid_bdev1, state offline 00:09:56.127 0 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1493622 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1493622 ']' 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1493622 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1493622 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1493622' 00:09:56.127 killing process with pid 1493622 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1493622 00:09:56.127 [2024-06-10 13:38:10.422035] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1493622 
00:09:56.127 [2024-06-10 13:38:10.427643] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.npEcxEhybR 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:09:56.127 00:09:56.127 real 0m5.454s 00:09:56.127 user 0m8.555s 00:09:56.127 sys 0m0.775s 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:56.127 13:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:56.127 ************************************ 00:09:56.127 END TEST raid_write_error_test 00:09:56.127 ************************************ 00:09:56.388 13:38:10 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:56.388 13:38:10 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:09:56.388 13:38:10 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:56.388 13:38:10 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:56.388 13:38:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:56.388 ************************************ 00:09:56.388 START TEST raid_state_function_test 00:09:56.388 ************************************ 
00:09:56.388 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 false 00:09:56.388 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:09:56.388 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:56.388 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:56.388 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:56.389 13:38:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1494968 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1494968' 00:09:56.389 Process raid pid: 1494968 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1494968 /var/tmp/spdk-raid.sock 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1494968 ']' 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:56.389 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:56.389 13:38:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:56.389 [2024-06-10 13:38:10.708941] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:09:56.389 [2024-06-10 13:38:10.708999] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:56.389 [2024-06-10 13:38:10.802814] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:56.650 [2024-06-10 13:38:10.872007] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:09:56.650 [2024-06-10 13:38:10.915248] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:56.650 [2024-06-10 13:38:10.915269] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:57.221 13:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:57.221 13:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:09:57.221 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:57.482 [2024-06-10 13:38:11.751206] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:57.482 [2024-06-10 13:38:11.751236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:57.482 [2024-06-10 13:38:11.751243] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:57.482 [2024-06-10 13:38:11.751249] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:57.482 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:57.743 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:57.743 "name": "Existed_Raid", 00:09:57.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:57.743 "strip_size_kb": 64, 00:09:57.743 "state": "configuring", 00:09:57.743 "raid_level": "concat", 00:09:57.743 "superblock": false, 00:09:57.743 "num_base_bdevs": 2, 00:09:57.743 "num_base_bdevs_discovered": 0, 00:09:57.743 "num_base_bdevs_operational": 2, 00:09:57.743 
"base_bdevs_list": [ 00:09:57.743 { 00:09:57.743 "name": "BaseBdev1", 00:09:57.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:57.743 "is_configured": false, 00:09:57.743 "data_offset": 0, 00:09:57.743 "data_size": 0 00:09:57.743 }, 00:09:57.743 { 00:09:57.743 "name": "BaseBdev2", 00:09:57.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:57.743 "is_configured": false, 00:09:57.743 "data_offset": 0, 00:09:57.743 "data_size": 0 00:09:57.743 } 00:09:57.743 ] 00:09:57.743 }' 00:09:57.743 13:38:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:57.743 13:38:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:58.314 13:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:58.314 [2024-06-10 13:38:12.725557] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:58.314 [2024-06-10 13:38:12.725578] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd6720 name Existed_Raid, state configuring 00:09:58.314 13:38:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:58.576 [2024-06-10 13:38:12.930092] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:58.576 [2024-06-10 13:38:12.930112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:58.576 [2024-06-10 13:38:12.930118] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:58.576 [2024-06-10 13:38:12.930124] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:58.576 13:38:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:58.836 [2024-06-10 13:38:13.129326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:58.836 BaseBdev1 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:58.836 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:59.097 [ 00:09:59.097 { 00:09:59.097 "name": "BaseBdev1", 00:09:59.097 "aliases": [ 00:09:59.097 "1481de32-a4a6-42d8-b1c6-96633415beef" 00:09:59.097 ], 00:09:59.097 "product_name": "Malloc disk", 00:09:59.097 "block_size": 512, 00:09:59.097 "num_blocks": 65536, 00:09:59.097 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:09:59.097 "assigned_rate_limits": { 00:09:59.097 "rw_ios_per_sec": 0, 00:09:59.097 "rw_mbytes_per_sec": 0, 00:09:59.097 "r_mbytes_per_sec": 0, 00:09:59.097 "w_mbytes_per_sec": 0 00:09:59.097 }, 00:09:59.097 "claimed": true, 
00:09:59.097 "claim_type": "exclusive_write", 00:09:59.097 "zoned": false, 00:09:59.097 "supported_io_types": { 00:09:59.097 "read": true, 00:09:59.097 "write": true, 00:09:59.097 "unmap": true, 00:09:59.097 "write_zeroes": true, 00:09:59.097 "flush": true, 00:09:59.097 "reset": true, 00:09:59.097 "compare": false, 00:09:59.097 "compare_and_write": false, 00:09:59.097 "abort": true, 00:09:59.097 "nvme_admin": false, 00:09:59.097 "nvme_io": false 00:09:59.097 }, 00:09:59.097 "memory_domains": [ 00:09:59.097 { 00:09:59.097 "dma_device_id": "system", 00:09:59.097 "dma_device_type": 1 00:09:59.097 }, 00:09:59.097 { 00:09:59.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.097 "dma_device_type": 2 00:09:59.097 } 00:09:59.097 ], 00:09:59.097 "driver_specific": {} 00:09:59.097 } 00:09:59.097 ] 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:59.097 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:59.097 13:38:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:59.098 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:59.098 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:59.358 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:59.358 "name": "Existed_Raid", 00:09:59.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:59.358 "strip_size_kb": 64, 00:09:59.358 "state": "configuring", 00:09:59.358 "raid_level": "concat", 00:09:59.358 "superblock": false, 00:09:59.358 "num_base_bdevs": 2, 00:09:59.358 "num_base_bdevs_discovered": 1, 00:09:59.358 "num_base_bdevs_operational": 2, 00:09:59.358 "base_bdevs_list": [ 00:09:59.358 { 00:09:59.358 "name": "BaseBdev1", 00:09:59.358 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:09:59.358 "is_configured": true, 00:09:59.358 "data_offset": 0, 00:09:59.358 "data_size": 65536 00:09:59.358 }, 00:09:59.358 { 00:09:59.358 "name": "BaseBdev2", 00:09:59.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:59.358 "is_configured": false, 00:09:59.358 "data_offset": 0, 00:09:59.358 "data_size": 0 00:09:59.358 } 00:09:59.358 ] 00:09:59.358 }' 00:09:59.358 13:38:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:59.358 13:38:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:59.928 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:00.188 [2024-06-10 13:38:14.492788] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:00.189 [2024-06-10 13:38:14.492818] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0xfd6010 name Existed_Raid, state configuring 00:10:00.189 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:00.449 [2024-06-10 13:38:14.697343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:00.449 [2024-06-10 13:38:14.698574] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:00.449 [2024-06-10 13:38:14.698601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:00.449 
13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:00.449 "name": "Existed_Raid", 00:10:00.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:00.449 "strip_size_kb": 64, 00:10:00.449 "state": "configuring", 00:10:00.449 "raid_level": "concat", 00:10:00.449 "superblock": false, 00:10:00.449 "num_base_bdevs": 2, 00:10:00.449 "num_base_bdevs_discovered": 1, 00:10:00.449 "num_base_bdevs_operational": 2, 00:10:00.449 "base_bdevs_list": [ 00:10:00.449 { 00:10:00.449 "name": "BaseBdev1", 00:10:00.449 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:10:00.449 "is_configured": true, 00:10:00.449 "data_offset": 0, 00:10:00.449 "data_size": 65536 00:10:00.449 }, 00:10:00.449 { 00:10:00.449 "name": "BaseBdev2", 00:10:00.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:00.449 "is_configured": false, 00:10:00.449 "data_offset": 0, 00:10:00.449 "data_size": 0 00:10:00.449 } 00:10:00.449 ] 00:10:00.449 }' 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:00.449 13:38:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.021 13:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:01.283 [2024-06-10 13:38:15.632812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:01.283 [2024-06-10 13:38:15.632835] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfd6e00 00:10:01.283 [2024-06-10 13:38:15.632840] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:01.283 [2024-06-10 13:38:15.632994] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfcd6a0 00:10:01.283 [2024-06-10 13:38:15.633086] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfd6e00 00:10:01.283 [2024-06-10 13:38:15.633092] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfd6e00 00:10:01.283 [2024-06-10 13:38:15.633229] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:01.283 BaseBdev2 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:01.283 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:01.543 13:38:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:01.802 [ 00:10:01.802 { 00:10:01.802 "name": "BaseBdev2", 00:10:01.802 "aliases": [ 00:10:01.802 "0f14ec88-e772-46d7-84bb-e9ffff683122" 00:10:01.802 ], 
00:10:01.803 "product_name": "Malloc disk", 00:10:01.803 "block_size": 512, 00:10:01.803 "num_blocks": 65536, 00:10:01.803 "uuid": "0f14ec88-e772-46d7-84bb-e9ffff683122", 00:10:01.803 "assigned_rate_limits": { 00:10:01.803 "rw_ios_per_sec": 0, 00:10:01.803 "rw_mbytes_per_sec": 0, 00:10:01.803 "r_mbytes_per_sec": 0, 00:10:01.803 "w_mbytes_per_sec": 0 00:10:01.803 }, 00:10:01.803 "claimed": true, 00:10:01.803 "claim_type": "exclusive_write", 00:10:01.803 "zoned": false, 00:10:01.803 "supported_io_types": { 00:10:01.803 "read": true, 00:10:01.803 "write": true, 00:10:01.803 "unmap": true, 00:10:01.803 "write_zeroes": true, 00:10:01.803 "flush": true, 00:10:01.803 "reset": true, 00:10:01.803 "compare": false, 00:10:01.803 "compare_and_write": false, 00:10:01.803 "abort": true, 00:10:01.803 "nvme_admin": false, 00:10:01.803 "nvme_io": false 00:10:01.803 }, 00:10:01.803 "memory_domains": [ 00:10:01.803 { 00:10:01.803 "dma_device_id": "system", 00:10:01.803 "dma_device_type": 1 00:10:01.803 }, 00:10:01.803 { 00:10:01.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.803 "dma_device_type": 2 00:10:01.803 } 00:10:01.803 ], 00:10:01.803 "driver_specific": {} 00:10:01.803 } 00:10:01.803 ] 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.803 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:02.063 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:02.063 "name": "Existed_Raid", 00:10:02.063 "uuid": "37fe9db6-3bb4-4e03-ac02-f9dff69ff29a", 00:10:02.063 "strip_size_kb": 64, 00:10:02.063 "state": "online", 00:10:02.063 "raid_level": "concat", 00:10:02.063 "superblock": false, 00:10:02.063 "num_base_bdevs": 2, 00:10:02.063 "num_base_bdevs_discovered": 2, 00:10:02.063 "num_base_bdevs_operational": 2, 00:10:02.063 "base_bdevs_list": [ 00:10:02.063 { 00:10:02.063 "name": "BaseBdev1", 00:10:02.063 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:10:02.063 "is_configured": true, 00:10:02.063 "data_offset": 0, 00:10:02.063 "data_size": 65536 00:10:02.063 }, 00:10:02.063 { 00:10:02.063 "name": "BaseBdev2", 00:10:02.063 "uuid": "0f14ec88-e772-46d7-84bb-e9ffff683122", 00:10:02.063 "is_configured": true, 00:10:02.063 "data_offset": 0, 00:10:02.063 "data_size": 65536 00:10:02.063 } 00:10:02.063 ] 00:10:02.063 }' 
00:10:02.063 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:02.063 13:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:02.633 13:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:02.633 [2024-06-10 13:38:17.016533] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:02.633 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:02.633 "name": "Existed_Raid", 00:10:02.633 "aliases": [ 00:10:02.633 "37fe9db6-3bb4-4e03-ac02-f9dff69ff29a" 00:10:02.633 ], 00:10:02.633 "product_name": "Raid Volume", 00:10:02.633 "block_size": 512, 00:10:02.633 "num_blocks": 131072, 00:10:02.633 "uuid": "37fe9db6-3bb4-4e03-ac02-f9dff69ff29a", 00:10:02.633 "assigned_rate_limits": { 00:10:02.633 "rw_ios_per_sec": 0, 00:10:02.633 "rw_mbytes_per_sec": 0, 00:10:02.633 "r_mbytes_per_sec": 0, 00:10:02.633 "w_mbytes_per_sec": 0 00:10:02.633 }, 00:10:02.633 "claimed": false, 00:10:02.633 "zoned": false, 00:10:02.633 "supported_io_types": 
{ 00:10:02.633 "read": true, 00:10:02.633 "write": true, 00:10:02.633 "unmap": true, 00:10:02.633 "write_zeroes": true, 00:10:02.633 "flush": true, 00:10:02.633 "reset": true, 00:10:02.633 "compare": false, 00:10:02.633 "compare_and_write": false, 00:10:02.633 "abort": false, 00:10:02.633 "nvme_admin": false, 00:10:02.633 "nvme_io": false 00:10:02.633 }, 00:10:02.633 "memory_domains": [ 00:10:02.633 { 00:10:02.633 "dma_device_id": "system", 00:10:02.633 "dma_device_type": 1 00:10:02.633 }, 00:10:02.633 { 00:10:02.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.633 "dma_device_type": 2 00:10:02.633 }, 00:10:02.633 { 00:10:02.633 "dma_device_id": "system", 00:10:02.633 "dma_device_type": 1 00:10:02.633 }, 00:10:02.633 { 00:10:02.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.633 "dma_device_type": 2 00:10:02.633 } 00:10:02.633 ], 00:10:02.633 "driver_specific": { 00:10:02.633 "raid": { 00:10:02.633 "uuid": "37fe9db6-3bb4-4e03-ac02-f9dff69ff29a", 00:10:02.634 "strip_size_kb": 64, 00:10:02.634 "state": "online", 00:10:02.634 "raid_level": "concat", 00:10:02.634 "superblock": false, 00:10:02.634 "num_base_bdevs": 2, 00:10:02.634 "num_base_bdevs_discovered": 2, 00:10:02.634 "num_base_bdevs_operational": 2, 00:10:02.634 "base_bdevs_list": [ 00:10:02.634 { 00:10:02.634 "name": "BaseBdev1", 00:10:02.634 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:10:02.634 "is_configured": true, 00:10:02.634 "data_offset": 0, 00:10:02.634 "data_size": 65536 00:10:02.634 }, 00:10:02.634 { 00:10:02.634 "name": "BaseBdev2", 00:10:02.634 "uuid": "0f14ec88-e772-46d7-84bb-e9ffff683122", 00:10:02.634 "is_configured": true, 00:10:02.634 "data_offset": 0, 00:10:02.634 "data_size": 65536 00:10:02.634 } 00:10:02.634 ] 00:10:02.634 } 00:10:02.634 } 00:10:02.634 }' 00:10:02.634 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:02.634 13:38:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:02.634 BaseBdev2' 00:10:02.634 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:02.634 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:02.634 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:02.894 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:02.894 "name": "BaseBdev1", 00:10:02.894 "aliases": [ 00:10:02.894 "1481de32-a4a6-42d8-b1c6-96633415beef" 00:10:02.894 ], 00:10:02.894 "product_name": "Malloc disk", 00:10:02.894 "block_size": 512, 00:10:02.894 "num_blocks": 65536, 00:10:02.894 "uuid": "1481de32-a4a6-42d8-b1c6-96633415beef", 00:10:02.894 "assigned_rate_limits": { 00:10:02.894 "rw_ios_per_sec": 0, 00:10:02.894 "rw_mbytes_per_sec": 0, 00:10:02.894 "r_mbytes_per_sec": 0, 00:10:02.894 "w_mbytes_per_sec": 0 00:10:02.894 }, 00:10:02.894 "claimed": true, 00:10:02.894 "claim_type": "exclusive_write", 00:10:02.894 "zoned": false, 00:10:02.894 "supported_io_types": { 00:10:02.894 "read": true, 00:10:02.894 "write": true, 00:10:02.894 "unmap": true, 00:10:02.894 "write_zeroes": true, 00:10:02.894 "flush": true, 00:10:02.894 "reset": true, 00:10:02.894 "compare": false, 00:10:02.894 "compare_and_write": false, 00:10:02.894 "abort": true, 00:10:02.894 "nvme_admin": false, 00:10:02.894 "nvme_io": false 00:10:02.894 }, 00:10:02.894 "memory_domains": [ 00:10:02.894 { 00:10:02.894 "dma_device_id": "system", 00:10:02.894 "dma_device_type": 1 00:10:02.894 }, 00:10:02.894 { 00:10:02.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.894 "dma_device_type": 2 00:10:02.894 } 00:10:02.894 ], 00:10:02.894 "driver_specific": {} 00:10:02.894 }' 00:10:02.894 13:38:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:02.894 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:03.154 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:03.154 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:03.154 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:03.154 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:03.155 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:03.415 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:03.415 "name": "BaseBdev2", 00:10:03.415 "aliases": [ 00:10:03.415 "0f14ec88-e772-46d7-84bb-e9ffff683122" 00:10:03.415 ], 00:10:03.415 "product_name": "Malloc disk", 00:10:03.415 "block_size": 512, 00:10:03.415 "num_blocks": 65536, 00:10:03.415 "uuid": 
"0f14ec88-e772-46d7-84bb-e9ffff683122", 00:10:03.415 "assigned_rate_limits": { 00:10:03.415 "rw_ios_per_sec": 0, 00:10:03.415 "rw_mbytes_per_sec": 0, 00:10:03.415 "r_mbytes_per_sec": 0, 00:10:03.415 "w_mbytes_per_sec": 0 00:10:03.415 }, 00:10:03.415 "claimed": true, 00:10:03.415 "claim_type": "exclusive_write", 00:10:03.415 "zoned": false, 00:10:03.415 "supported_io_types": { 00:10:03.415 "read": true, 00:10:03.415 "write": true, 00:10:03.415 "unmap": true, 00:10:03.415 "write_zeroes": true, 00:10:03.415 "flush": true, 00:10:03.415 "reset": true, 00:10:03.415 "compare": false, 00:10:03.415 "compare_and_write": false, 00:10:03.415 "abort": true, 00:10:03.415 "nvme_admin": false, 00:10:03.415 "nvme_io": false 00:10:03.415 }, 00:10:03.415 "memory_domains": [ 00:10:03.415 { 00:10:03.415 "dma_device_id": "system", 00:10:03.415 "dma_device_type": 1 00:10:03.415 }, 00:10:03.415 { 00:10:03.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:03.415 "dma_device_type": 2 00:10:03.415 } 00:10:03.415 ], 00:10:03.415 "driver_specific": {} 00:10:03.415 }' 00:10:03.415 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:03.415 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:03.676 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:03.676 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:03.676 13:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:03.676 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:03.676 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:03.676 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:03.676 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:03.676 
13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:03.676 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:03.937 [2024-06-10 13:38:18.363796] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:03.937 [2024-06-10 13:38:18.363813] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:03.937 [2024-06-10 13:38:18.363846] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:03.937 13:38:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:03.937 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:04.197 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:04.197 "name": "Existed_Raid", 00:10:04.197 "uuid": "37fe9db6-3bb4-4e03-ac02-f9dff69ff29a", 00:10:04.197 "strip_size_kb": 64, 00:10:04.197 "state": "offline", 00:10:04.197 "raid_level": "concat", 00:10:04.197 "superblock": false, 00:10:04.197 "num_base_bdevs": 2, 00:10:04.197 "num_base_bdevs_discovered": 1, 00:10:04.197 "num_base_bdevs_operational": 1, 00:10:04.197 "base_bdevs_list": [ 00:10:04.197 { 00:10:04.197 "name": null, 00:10:04.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:04.197 "is_configured": false, 00:10:04.197 "data_offset": 0, 00:10:04.197 "data_size": 65536 00:10:04.197 }, 00:10:04.197 { 00:10:04.197 "name": "BaseBdev2", 00:10:04.197 "uuid": "0f14ec88-e772-46d7-84bb-e9ffff683122", 00:10:04.197 "is_configured": true, 00:10:04.197 "data_offset": 0, 00:10:04.197 "data_size": 65536 00:10:04.197 } 00:10:04.197 ] 00:10:04.197 }' 00:10:04.197 13:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:04.197 13:38:18 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:10:04.765 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:04.765 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:04.765 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:04.765 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:05.026 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:05.026 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:05.026 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:05.314 [2024-06-10 13:38:19.550821] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:05.314 [2024-06-10 13:38:19.550854] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfd6e00 name Existed_Raid, state offline 00:10:05.314 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:05.314 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:05.314 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:05.314 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1494968 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1494968 ']' 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1494968 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1494968 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1494968' 00:10:05.619 killing process with pid 1494968 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1494968 00:10:05.619 [2024-06-10 13:38:19.843884] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1494968 00:10:05.619 [2024-06-10 13:38:19.844496] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:05.619 00:10:05.619 real 0m9.320s 00:10:05.619 user 0m16.956s 00:10:05.619 sys 0m1.411s 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:05.619 13:38:19 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:05.619 ************************************ 00:10:05.619 END TEST raid_state_function_test 00:10:05.619 ************************************ 00:10:05.619 13:38:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:10:05.619 13:38:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:05.619 13:38:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:05.619 13:38:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:05.619 ************************************ 00:10:05.619 START TEST raid_state_function_test_sb 00:10:05.619 ************************************ 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 true 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:05.619 13:38:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1496860 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1496860' 00:10:05.619 Process raid pid: 1496860 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1496860 /var/tmp/spdk-raid.sock 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1496860 ']' 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:05.619 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:05.619 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:05.886 [2024-06-10 13:38:20.103661] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:10:05.886 [2024-06-10 13:38:20.103710] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:05.886 [2024-06-10 13:38:20.193234] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.886 [2024-06-10 13:38:20.259509] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.886 [2024-06-10 13:38:20.304260] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:05.886 [2024-06-10 13:38:20.304283] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:06.456 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:06.456 13:38:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:10:06.456 13:38:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:06.716 [2024-06-10 13:38:21.043910] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:06.716 [2024-06-10 13:38:21.043941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:06.716 [2024-06-10 13:38:21.043948] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:06.716 [2024-06-10 13:38:21.043954] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:06.716 13:38:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:06.716 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:06.717 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:06.717 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:06.977 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:06.977 "name": "Existed_Raid", 00:10:06.977 "uuid": "9240a61b-7496-4523-adc4-a969a736d90e", 00:10:06.977 "strip_size_kb": 64, 00:10:06.977 "state": "configuring", 00:10:06.977 "raid_level": "concat", 00:10:06.977 "superblock": true, 00:10:06.977 "num_base_bdevs": 2, 00:10:06.977 "num_base_bdevs_discovered": 0, 00:10:06.977 "num_base_bdevs_operational": 2, 00:10:06.977 "base_bdevs_list": [ 00:10:06.977 { 00:10:06.977 "name": "BaseBdev1", 00:10:06.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:06.977 "is_configured": false, 00:10:06.977 "data_offset": 0, 00:10:06.977 "data_size": 0 00:10:06.977 }, 00:10:06.977 { 00:10:06.977 "name": 
"BaseBdev2", 00:10:06.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:06.977 "is_configured": false, 00:10:06.977 "data_offset": 0, 00:10:06.977 "data_size": 0 00:10:06.977 } 00:10:06.977 ] 00:10:06.977 }' 00:10:06.977 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:06.977 13:38:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:07.237 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:07.497 [2024-06-10 13:38:21.845854] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:07.497 [2024-06-10 13:38:21.845871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb09720 name Existed_Raid, state configuring 00:10:07.497 13:38:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:07.758 [2024-06-10 13:38:22.002271] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:07.758 [2024-06-10 13:38:22.002287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:07.758 [2024-06-10 13:38:22.002292] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:07.758 [2024-06-10 13:38:22.002298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:07.758 [2024-06-10 13:38:22.153467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:10:07.758 BaseBdev1 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:07.758 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:08.018 [ 00:10:08.018 { 00:10:08.018 "name": "BaseBdev1", 00:10:08.018 "aliases": [ 00:10:08.018 "d6b73ebb-b738-4981-9308-88d0d0a5a778" 00:10:08.018 ], 00:10:08.018 "product_name": "Malloc disk", 00:10:08.018 "block_size": 512, 00:10:08.018 "num_blocks": 65536, 00:10:08.018 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:08.018 "assigned_rate_limits": { 00:10:08.018 "rw_ios_per_sec": 0, 00:10:08.018 "rw_mbytes_per_sec": 0, 00:10:08.018 "r_mbytes_per_sec": 0, 00:10:08.018 "w_mbytes_per_sec": 0 00:10:08.018 }, 00:10:08.018 "claimed": true, 00:10:08.018 "claim_type": "exclusive_write", 00:10:08.018 "zoned": false, 00:10:08.018 "supported_io_types": { 00:10:08.018 "read": true, 00:10:08.018 "write": true, 00:10:08.018 "unmap": true, 00:10:08.018 "write_zeroes": true, 00:10:08.018 "flush": true, 
00:10:08.018 "reset": true, 00:10:08.018 "compare": false, 00:10:08.018 "compare_and_write": false, 00:10:08.018 "abort": true, 00:10:08.018 "nvme_admin": false, 00:10:08.018 "nvme_io": false 00:10:08.018 }, 00:10:08.018 "memory_domains": [ 00:10:08.018 { 00:10:08.018 "dma_device_id": "system", 00:10:08.018 "dma_device_type": 1 00:10:08.018 }, 00:10:08.018 { 00:10:08.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:08.018 "dma_device_type": 2 00:10:08.018 } 00:10:08.018 ], 00:10:08.018 "driver_specific": {} 00:10:08.018 } 00:10:08.018 ] 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.018 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:08.277 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:08.277 "name": "Existed_Raid", 00:10:08.277 "uuid": "edeed66c-4299-42d3-a139-e6b8daf81e1f", 00:10:08.277 "strip_size_kb": 64, 00:10:08.277 "state": "configuring", 00:10:08.277 "raid_level": "concat", 00:10:08.277 "superblock": true, 00:10:08.277 "num_base_bdevs": 2, 00:10:08.277 "num_base_bdevs_discovered": 1, 00:10:08.277 "num_base_bdevs_operational": 2, 00:10:08.277 "base_bdevs_list": [ 00:10:08.277 { 00:10:08.277 "name": "BaseBdev1", 00:10:08.277 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:08.277 "is_configured": true, 00:10:08.277 "data_offset": 2048, 00:10:08.277 "data_size": 63488 00:10:08.277 }, 00:10:08.277 { 00:10:08.277 "name": "BaseBdev2", 00:10:08.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:08.277 "is_configured": false, 00:10:08.277 "data_offset": 0, 00:10:08.277 "data_size": 0 00:10:08.277 } 00:10:08.277 ] 00:10:08.277 }' 00:10:08.277 13:38:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:08.277 13:38:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:08.848 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:08.848 [2024-06-10 13:38:23.244218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:08.848 [2024-06-10 13:38:23.244245] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb09010 name Existed_Raid, state configuring 00:10:08.848 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:09.108 [2024-06-10 13:38:23.388619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:09.108 [2024-06-10 13:38:23.389840] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:09.108 [2024-06-10 13:38:23.389865] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:09.108 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:09.109 13:38:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:09.109 "name": "Existed_Raid", 00:10:09.109 "uuid": "291402c4-d871-4b60-927b-c0b9bbb9ee18", 00:10:09.109 "strip_size_kb": 64, 00:10:09.109 "state": "configuring", 00:10:09.109 "raid_level": "concat", 00:10:09.109 "superblock": true, 00:10:09.109 "num_base_bdevs": 2, 00:10:09.109 "num_base_bdevs_discovered": 1, 00:10:09.109 "num_base_bdevs_operational": 2, 00:10:09.109 "base_bdevs_list": [ 00:10:09.109 { 00:10:09.109 "name": "BaseBdev1", 00:10:09.109 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:09.109 "is_configured": true, 00:10:09.109 "data_offset": 2048, 00:10:09.109 "data_size": 63488 00:10:09.109 }, 00:10:09.109 { 00:10:09.109 "name": "BaseBdev2", 00:10:09.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.109 "is_configured": false, 00:10:09.109 "data_offset": 0, 00:10:09.109 "data_size": 0 00:10:09.109 } 00:10:09.109 ] 00:10:09.109 }' 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:09.109 13:38:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:09.679 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:09.938 [2024-06-10 13:38:24.324034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:09.938 [2024-06-10 13:38:24.324143] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb09e00 00:10:09.938 
[2024-06-10 13:38:24.324152] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:09.938 [2024-06-10 13:38:24.324309] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb0b110 00:10:09.938 [2024-06-10 13:38:24.324398] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb09e00 00:10:09.938 [2024-06-10 13:38:24.324404] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb09e00 00:10:09.938 [2024-06-10 13:38:24.324478] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:09.938 BaseBdev2 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:09.938 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:10.198 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:10.458 [ 00:10:10.458 { 00:10:10.458 "name": "BaseBdev2", 00:10:10.458 "aliases": [ 00:10:10.458 "e34761be-6102-4fc8-8f38-820b4ed93f7c" 00:10:10.458 ], 00:10:10.458 "product_name": "Malloc disk", 00:10:10.458 "block_size": 512, 
00:10:10.458 "num_blocks": 65536, 00:10:10.458 "uuid": "e34761be-6102-4fc8-8f38-820b4ed93f7c", 00:10:10.458 "assigned_rate_limits": { 00:10:10.458 "rw_ios_per_sec": 0, 00:10:10.458 "rw_mbytes_per_sec": 0, 00:10:10.458 "r_mbytes_per_sec": 0, 00:10:10.458 "w_mbytes_per_sec": 0 00:10:10.458 }, 00:10:10.458 "claimed": true, 00:10:10.458 "claim_type": "exclusive_write", 00:10:10.458 "zoned": false, 00:10:10.458 "supported_io_types": { 00:10:10.458 "read": true, 00:10:10.458 "write": true, 00:10:10.458 "unmap": true, 00:10:10.459 "write_zeroes": true, 00:10:10.459 "flush": true, 00:10:10.459 "reset": true, 00:10:10.459 "compare": false, 00:10:10.459 "compare_and_write": false, 00:10:10.459 "abort": true, 00:10:10.459 "nvme_admin": false, 00:10:10.459 "nvme_io": false 00:10:10.459 }, 00:10:10.459 "memory_domains": [ 00:10:10.459 { 00:10:10.459 "dma_device_id": "system", 00:10:10.459 "dma_device_type": 1 00:10:10.459 }, 00:10:10.459 { 00:10:10.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:10.459 "dma_device_type": 2 00:10:10.459 } 00:10:10.459 ], 00:10:10.459 "driver_specific": {} 00:10:10.459 } 00:10:10.459 ] 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:10.459 13:38:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:10.459 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:10.719 13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:10.719 "name": "Existed_Raid", 00:10:10.719 "uuid": "291402c4-d871-4b60-927b-c0b9bbb9ee18", 00:10:10.719 "strip_size_kb": 64, 00:10:10.719 "state": "online", 00:10:10.719 "raid_level": "concat", 00:10:10.719 "superblock": true, 00:10:10.719 "num_base_bdevs": 2, 00:10:10.719 "num_base_bdevs_discovered": 2, 00:10:10.719 "num_base_bdevs_operational": 2, 00:10:10.719 "base_bdevs_list": [ 00:10:10.719 { 00:10:10.719 "name": "BaseBdev1", 00:10:10.719 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:10.719 "is_configured": true, 00:10:10.719 "data_offset": 2048, 00:10:10.719 "data_size": 63488 00:10:10.719 }, 00:10:10.719 { 00:10:10.719 "name": "BaseBdev2", 00:10:10.719 "uuid": "e34761be-6102-4fc8-8f38-820b4ed93f7c", 00:10:10.719 "is_configured": true, 00:10:10.719 "data_offset": 2048, 00:10:10.719 "data_size": 63488 00:10:10.719 } 00:10:10.719 ] 00:10:10.719 }' 00:10:10.719 
13:38:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:10.719 13:38:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:11.291 [2024-06-10 13:38:25.691723] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:11.291 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:11.291 "name": "Existed_Raid", 00:10:11.291 "aliases": [ 00:10:11.291 "291402c4-d871-4b60-927b-c0b9bbb9ee18" 00:10:11.291 ], 00:10:11.291 "product_name": "Raid Volume", 00:10:11.291 "block_size": 512, 00:10:11.291 "num_blocks": 126976, 00:10:11.291 "uuid": "291402c4-d871-4b60-927b-c0b9bbb9ee18", 00:10:11.291 "assigned_rate_limits": { 00:10:11.291 "rw_ios_per_sec": 0, 00:10:11.291 "rw_mbytes_per_sec": 0, 00:10:11.291 "r_mbytes_per_sec": 0, 00:10:11.291 "w_mbytes_per_sec": 0 00:10:11.291 }, 00:10:11.291 "claimed": false, 00:10:11.291 "zoned": false, 00:10:11.291 
"supported_io_types": { 00:10:11.291 "read": true, 00:10:11.291 "write": true, 00:10:11.291 "unmap": true, 00:10:11.291 "write_zeroes": true, 00:10:11.291 "flush": true, 00:10:11.291 "reset": true, 00:10:11.291 "compare": false, 00:10:11.291 "compare_and_write": false, 00:10:11.291 "abort": false, 00:10:11.291 "nvme_admin": false, 00:10:11.291 "nvme_io": false 00:10:11.291 }, 00:10:11.291 "memory_domains": [ 00:10:11.291 { 00:10:11.291 "dma_device_id": "system", 00:10:11.291 "dma_device_type": 1 00:10:11.291 }, 00:10:11.291 { 00:10:11.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:11.291 "dma_device_type": 2 00:10:11.291 }, 00:10:11.291 { 00:10:11.291 "dma_device_id": "system", 00:10:11.291 "dma_device_type": 1 00:10:11.291 }, 00:10:11.291 { 00:10:11.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:11.291 "dma_device_type": 2 00:10:11.291 } 00:10:11.291 ], 00:10:11.291 "driver_specific": { 00:10:11.291 "raid": { 00:10:11.291 "uuid": "291402c4-d871-4b60-927b-c0b9bbb9ee18", 00:10:11.291 "strip_size_kb": 64, 00:10:11.291 "state": "online", 00:10:11.291 "raid_level": "concat", 00:10:11.291 "superblock": true, 00:10:11.291 "num_base_bdevs": 2, 00:10:11.291 "num_base_bdevs_discovered": 2, 00:10:11.291 "num_base_bdevs_operational": 2, 00:10:11.291 "base_bdevs_list": [ 00:10:11.291 { 00:10:11.291 "name": "BaseBdev1", 00:10:11.291 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:11.291 "is_configured": true, 00:10:11.291 "data_offset": 2048, 00:10:11.292 "data_size": 63488 00:10:11.292 }, 00:10:11.292 { 00:10:11.292 "name": "BaseBdev2", 00:10:11.292 "uuid": "e34761be-6102-4fc8-8f38-820b4ed93f7c", 00:10:11.292 "is_configured": true, 00:10:11.292 "data_offset": 2048, 00:10:11.292 "data_size": 63488 00:10:11.292 } 00:10:11.292 ] 00:10:11.292 } 00:10:11.292 } 00:10:11.292 }' 00:10:11.292 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:11.292 
13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:11.292 BaseBdev2' 00:10:11.292 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:11.292 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:11.292 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:11.552 13:38:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:11.552 "name": "BaseBdev1", 00:10:11.552 "aliases": [ 00:10:11.552 "d6b73ebb-b738-4981-9308-88d0d0a5a778" 00:10:11.552 ], 00:10:11.552 "product_name": "Malloc disk", 00:10:11.552 "block_size": 512, 00:10:11.552 "num_blocks": 65536, 00:10:11.552 "uuid": "d6b73ebb-b738-4981-9308-88d0d0a5a778", 00:10:11.552 "assigned_rate_limits": { 00:10:11.552 "rw_ios_per_sec": 0, 00:10:11.552 "rw_mbytes_per_sec": 0, 00:10:11.552 "r_mbytes_per_sec": 0, 00:10:11.552 "w_mbytes_per_sec": 0 00:10:11.552 }, 00:10:11.552 "claimed": true, 00:10:11.552 "claim_type": "exclusive_write", 00:10:11.552 "zoned": false, 00:10:11.552 "supported_io_types": { 00:10:11.553 "read": true, 00:10:11.553 "write": true, 00:10:11.553 "unmap": true, 00:10:11.553 "write_zeroes": true, 00:10:11.553 "flush": true, 00:10:11.553 "reset": true, 00:10:11.553 "compare": false, 00:10:11.553 "compare_and_write": false, 00:10:11.553 "abort": true, 00:10:11.553 "nvme_admin": false, 00:10:11.553 "nvme_io": false 00:10:11.553 }, 00:10:11.553 "memory_domains": [ 00:10:11.553 { 00:10:11.553 "dma_device_id": "system", 00:10:11.553 "dma_device_type": 1 00:10:11.553 }, 00:10:11.553 { 00:10:11.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:11.553 "dma_device_type": 2 00:10:11.553 } 00:10:11.553 ], 00:10:11.553 "driver_specific": {} 00:10:11.553 }' 00:10:11.553 13:38:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:11.553 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:11.814 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:12.074 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:12.074 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:12.074 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:12.074 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:12.074 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:12.074 "name": "BaseBdev2", 00:10:12.074 "aliases": [ 00:10:12.074 "e34761be-6102-4fc8-8f38-820b4ed93f7c" 00:10:12.074 ], 00:10:12.074 "product_name": "Malloc disk", 00:10:12.074 "block_size": 512, 00:10:12.074 
"num_blocks": 65536, 00:10:12.074 "uuid": "e34761be-6102-4fc8-8f38-820b4ed93f7c", 00:10:12.074 "assigned_rate_limits": { 00:10:12.074 "rw_ios_per_sec": 0, 00:10:12.074 "rw_mbytes_per_sec": 0, 00:10:12.074 "r_mbytes_per_sec": 0, 00:10:12.074 "w_mbytes_per_sec": 0 00:10:12.074 }, 00:10:12.074 "claimed": true, 00:10:12.074 "claim_type": "exclusive_write", 00:10:12.074 "zoned": false, 00:10:12.074 "supported_io_types": { 00:10:12.074 "read": true, 00:10:12.074 "write": true, 00:10:12.074 "unmap": true, 00:10:12.074 "write_zeroes": true, 00:10:12.074 "flush": true, 00:10:12.074 "reset": true, 00:10:12.074 "compare": false, 00:10:12.074 "compare_and_write": false, 00:10:12.074 "abort": true, 00:10:12.074 "nvme_admin": false, 00:10:12.075 "nvme_io": false 00:10:12.075 }, 00:10:12.075 "memory_domains": [ 00:10:12.075 { 00:10:12.075 "dma_device_id": "system", 00:10:12.075 "dma_device_type": 1 00:10:12.075 }, 00:10:12.075 { 00:10:12.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:12.075 "dma_device_type": 2 00:10:12.075 } 00:10:12.075 ], 00:10:12.075 "driver_specific": {} 00:10:12.075 }' 00:10:12.075 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:12.335 13:38:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:12.335 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:12.596 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:12.596 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:12.596 13:38:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:12.596 [2024-06-10 13:38:27.059039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:12.596 [2024-06-10 13:38:27.059056] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:12.596 [2024-06-10 13:38:27.059088] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:12.856 "name": "Existed_Raid", 00:10:12.856 "uuid": "291402c4-d871-4b60-927b-c0b9bbb9ee18", 00:10:12.856 "strip_size_kb": 64, 00:10:12.856 "state": "offline", 00:10:12.856 "raid_level": "concat", 00:10:12.856 "superblock": true, 00:10:12.856 "num_base_bdevs": 2, 00:10:12.856 "num_base_bdevs_discovered": 1, 00:10:12.856 "num_base_bdevs_operational": 1, 00:10:12.856 "base_bdevs_list": [ 00:10:12.856 { 00:10:12.856 "name": null, 00:10:12.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:12.856 "is_configured": false, 00:10:12.856 "data_offset": 2048, 00:10:12.856 "data_size": 63488 00:10:12.856 }, 00:10:12.856 { 00:10:12.856 "name": "BaseBdev2", 00:10:12.856 "uuid": "e34761be-6102-4fc8-8f38-820b4ed93f7c", 00:10:12.856 "is_configured": true, 00:10:12.856 "data_offset": 2048, 00:10:12.856 "data_size": 63488 00:10:12.856 } 
00:10:12.856 ] 00:10:12.856 }' 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:12.856 13:38:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:13.425 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:13.425 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:13.425 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.425 13:38:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:13.685 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:13.685 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:13.685 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:13.947 [2024-06-10 13:38:28.226005] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:13.947 [2024-06-10 13:38:28.226044] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb09e00 name Existed_Raid, state offline 00:10:13.947 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:13.947 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:13.947 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.947 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r 
'.[0]["name"] | select(.)' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1496860 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1496860 ']' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1496860 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1496860 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1496860' 00:10:14.208 killing process with pid 1496860 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1496860 00:10:14.208 [2024-06-10 13:38:28.520853] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1496860 00:10:14.208 [2024-06-10 13:38:28.521467] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 
00:10:14.208 00:10:14.208 real 0m8.603s 00:10:14.208 user 0m15.582s 00:10:14.208 sys 0m1.332s 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:14.208 13:38:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:14.208 ************************************ 00:10:14.208 END TEST raid_state_function_test_sb 00:10:14.208 ************************************ 00:10:14.470 13:38:28 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:14.470 13:38:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:10:14.470 13:38:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:14.470 13:38:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:14.470 ************************************ 00:10:14.470 START TEST raid_superblock_test 00:10:14.470 ************************************ 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 2 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1498810 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1498810 /var/tmp/spdk-raid.sock 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1498810 ']' 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:14.470 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:14.470 13:38:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:14.470 [2024-06-10 13:38:28.793096] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:10:14.470 [2024-06-10 13:38:28.793172] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1498810 ] 00:10:14.470 [2024-06-10 13:38:28.885864] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.731 [2024-06-10 13:38:28.955720] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.731 [2024-06-10 13:38:29.000240] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:14.731 [2024-06-10 13:38:29.000264] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:15.302 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:15.303 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:15.303 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:15.303 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:10:15.303 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:15.303 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:15.563 malloc1 00:10:15.563 13:38:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:15.563 [2024-06-10 13:38:30.011389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:15.563 [2024-06-10 13:38:30.011427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:15.563 [2024-06-10 13:38:30.011442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1af0550 00:10:15.563 [2024-06-10 13:38:30.011450] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:15.563 [2024-06-10 13:38:30.012805] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:15.563 [2024-06-10 13:38:30.012825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:15.563 pt1 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:15.563 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:15.822 malloc2 00:10:15.822 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:16.083 [2024-06-10 13:38:30.434613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:16.083 [2024-06-10 13:38:30.434645] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:16.083 [2024-06-10 13:38:30.434657] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb20f0 00:10:16.083 [2024-06-10 13:38:30.434664] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:16.083 [2024-06-10 13:38:30.435942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:16.083 [2024-06-10 13:38:30.435963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:16.083 pt2 00:10:16.083 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:16.083 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:16.083 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:16.343 [2024-06-10 13:38:30.623102] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:16.343 [2024-06-10 13:38:30.624175] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:16.343 [2024-06-10 13:38:30.624291] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bbc690 00:10:16.343 [2024-06-10 13:38:30.624299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:16.343 [2024-06-10 13:38:30.624451] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb42e0 00:10:16.343 [2024-06-10 13:38:30.624569] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bbc690 00:10:16.343 [2024-06-10 13:38:30.624575] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bbc690 00:10:16.343 [2024-06-10 13:38:30.624652] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:16.343 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:16.344 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:16.344 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:16.344 13:38:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:16.344 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:16.344 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:16.603 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:16.603 "name": "raid_bdev1", 00:10:16.603 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:16.603 "strip_size_kb": 64, 00:10:16.603 "state": "online", 00:10:16.603 "raid_level": "concat", 00:10:16.603 "superblock": true, 00:10:16.603 "num_base_bdevs": 2, 00:10:16.603 "num_base_bdevs_discovered": 2, 00:10:16.603 "num_base_bdevs_operational": 2, 00:10:16.603 "base_bdevs_list": [ 00:10:16.603 { 00:10:16.603 "name": "pt1", 00:10:16.603 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:16.603 "is_configured": true, 00:10:16.603 "data_offset": 2048, 00:10:16.603 "data_size": 63488 00:10:16.603 }, 00:10:16.603 { 00:10:16.603 "name": "pt2", 00:10:16.603 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:16.603 "is_configured": true, 00:10:16.603 "data_offset": 2048, 00:10:16.603 "data_size": 63488 00:10:16.603 } 00:10:16.603 ] 00:10:16.603 }' 00:10:16.603 13:38:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:16.603 13:38:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:17.174 [2024-06-10 13:38:31.585699] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:17.174 "name": "raid_bdev1", 00:10:17.174 "aliases": [ 00:10:17.174 "92d6026a-fd79-4325-bd6a-2bc16028013d" 00:10:17.174 ], 00:10:17.174 "product_name": "Raid Volume", 00:10:17.174 "block_size": 512, 00:10:17.174 "num_blocks": 126976, 00:10:17.174 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:17.174 "assigned_rate_limits": { 00:10:17.174 "rw_ios_per_sec": 0, 00:10:17.174 "rw_mbytes_per_sec": 0, 00:10:17.174 "r_mbytes_per_sec": 0, 00:10:17.174 "w_mbytes_per_sec": 0 00:10:17.174 }, 00:10:17.174 "claimed": false, 00:10:17.174 "zoned": false, 00:10:17.174 "supported_io_types": { 00:10:17.174 "read": true, 00:10:17.174 "write": true, 00:10:17.174 "unmap": true, 00:10:17.174 "write_zeroes": true, 00:10:17.174 "flush": true, 00:10:17.174 "reset": true, 00:10:17.174 "compare": false, 00:10:17.174 "compare_and_write": false, 00:10:17.174 "abort": false, 00:10:17.174 "nvme_admin": false, 00:10:17.174 "nvme_io": false 00:10:17.174 }, 00:10:17.174 "memory_domains": [ 00:10:17.174 { 00:10:17.174 "dma_device_id": "system", 00:10:17.174 "dma_device_type": 1 00:10:17.174 }, 00:10:17.174 { 00:10:17.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.174 "dma_device_type": 2 00:10:17.174 }, 00:10:17.174 { 00:10:17.174 "dma_device_id": "system", 
00:10:17.174 "dma_device_type": 1 00:10:17.174 }, 00:10:17.174 { 00:10:17.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.174 "dma_device_type": 2 00:10:17.174 } 00:10:17.174 ], 00:10:17.174 "driver_specific": { 00:10:17.174 "raid": { 00:10:17.174 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:17.174 "strip_size_kb": 64, 00:10:17.174 "state": "online", 00:10:17.174 "raid_level": "concat", 00:10:17.174 "superblock": true, 00:10:17.174 "num_base_bdevs": 2, 00:10:17.174 "num_base_bdevs_discovered": 2, 00:10:17.174 "num_base_bdevs_operational": 2, 00:10:17.174 "base_bdevs_list": [ 00:10:17.174 { 00:10:17.174 "name": "pt1", 00:10:17.174 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:17.174 "is_configured": true, 00:10:17.174 "data_offset": 2048, 00:10:17.174 "data_size": 63488 00:10:17.174 }, 00:10:17.174 { 00:10:17.174 "name": "pt2", 00:10:17.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:17.174 "is_configured": true, 00:10:17.174 "data_offset": 2048, 00:10:17.174 "data_size": 63488 00:10:17.174 } 00:10:17.174 ] 00:10:17.174 } 00:10:17.174 } 00:10:17.174 }' 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:17.174 pt2' 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:17.174 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:17.434 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:17.434 "name": "pt1", 00:10:17.434 "aliases": [ 00:10:17.434 "00000000-0000-0000-0000-000000000001" 
00:10:17.434 ], 00:10:17.434 "product_name": "passthru", 00:10:17.434 "block_size": 512, 00:10:17.434 "num_blocks": 65536, 00:10:17.434 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:17.434 "assigned_rate_limits": { 00:10:17.434 "rw_ios_per_sec": 0, 00:10:17.434 "rw_mbytes_per_sec": 0, 00:10:17.434 "r_mbytes_per_sec": 0, 00:10:17.434 "w_mbytes_per_sec": 0 00:10:17.434 }, 00:10:17.434 "claimed": true, 00:10:17.434 "claim_type": "exclusive_write", 00:10:17.434 "zoned": false, 00:10:17.434 "supported_io_types": { 00:10:17.434 "read": true, 00:10:17.434 "write": true, 00:10:17.434 "unmap": true, 00:10:17.434 "write_zeroes": true, 00:10:17.434 "flush": true, 00:10:17.434 "reset": true, 00:10:17.434 "compare": false, 00:10:17.434 "compare_and_write": false, 00:10:17.434 "abort": true, 00:10:17.434 "nvme_admin": false, 00:10:17.434 "nvme_io": false 00:10:17.434 }, 00:10:17.434 "memory_domains": [ 00:10:17.434 { 00:10:17.434 "dma_device_id": "system", 00:10:17.434 "dma_device_type": 1 00:10:17.434 }, 00:10:17.434 { 00:10:17.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.434 "dma_device_type": 2 00:10:17.434 } 00:10:17.434 ], 00:10:17.434 "driver_specific": { 00:10:17.434 "passthru": { 00:10:17.434 "name": "pt1", 00:10:17.434 "base_bdev_name": "malloc1" 00:10:17.434 } 00:10:17.434 } 00:10:17.434 }' 00:10:17.434 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:17.434 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:17.694 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:17.694 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:17.694 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:17.694 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:17.694 13:38:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:10:17.694 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:17.694 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:17.694 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:17.694 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:17.955 "name": "pt2", 00:10:17.955 "aliases": [ 00:10:17.955 "00000000-0000-0000-0000-000000000002" 00:10:17.955 ], 00:10:17.955 "product_name": "passthru", 00:10:17.955 "block_size": 512, 00:10:17.955 "num_blocks": 65536, 00:10:17.955 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:17.955 "assigned_rate_limits": { 00:10:17.955 "rw_ios_per_sec": 0, 00:10:17.955 "rw_mbytes_per_sec": 0, 00:10:17.955 "r_mbytes_per_sec": 0, 00:10:17.955 "w_mbytes_per_sec": 0 00:10:17.955 }, 00:10:17.955 "claimed": true, 00:10:17.955 "claim_type": "exclusive_write", 00:10:17.955 "zoned": false, 00:10:17.955 "supported_io_types": { 00:10:17.955 "read": true, 00:10:17.955 "write": true, 00:10:17.955 "unmap": true, 00:10:17.955 "write_zeroes": true, 00:10:17.955 "flush": true, 00:10:17.955 "reset": true, 00:10:17.955 "compare": false, 00:10:17.955 "compare_and_write": false, 00:10:17.955 "abort": true, 00:10:17.955 "nvme_admin": false, 00:10:17.955 "nvme_io": false 00:10:17.955 }, 
00:10:17.955 "memory_domains": [ 00:10:17.955 { 00:10:17.955 "dma_device_id": "system", 00:10:17.955 "dma_device_type": 1 00:10:17.955 }, 00:10:17.955 { 00:10:17.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:17.955 "dma_device_type": 2 00:10:17.955 } 00:10:17.955 ], 00:10:17.955 "driver_specific": { 00:10:17.955 "passthru": { 00:10:17.955 "name": "pt2", 00:10:17.955 "base_bdev_name": "malloc2" 00:10:17.955 } 00:10:17.955 } 00:10:17.955 }' 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:17.955 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:18.215 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:18.475 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:18.475 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:18.475 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:18.475 [2024-06-10 13:38:32.913070] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:18.475 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=92d6026a-fd79-4325-bd6a-2bc16028013d 00:10:18.476 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 92d6026a-fd79-4325-bd6a-2bc16028013d ']' 00:10:18.476 13:38:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:18.735 [2024-06-10 13:38:33.117403] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:18.735 [2024-06-10 13:38:33.117417] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:18.735 [2024-06-10 13:38:33.117459] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:18.735 [2024-06-10 13:38:33.117495] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:18.736 [2024-06-10 13:38:33.117501] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbc690 name raid_bdev1, state offline 00:10:18.736 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:18.736 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:18.995 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:18.995 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:18.995 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:18.995 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:19.256 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:19.256 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:19.517 13:38:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:19.778 [2024-06-10 13:38:34.139965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:19.778 [2024-06-10 13:38:34.141111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:19.778 [2024-06-10 13:38:34.141156] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:19.778 [2024-06-10 13:38:34.141193] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:19.778 [2024-06-10 13:38:34.141205] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:19.778 [2024-06-10 13:38:34.141210] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb2c10 name raid_bdev1, state configuring 00:10:19.778 request: 00:10:19.778 { 00:10:19.778 "name": "raid_bdev1", 00:10:19.778 "raid_level": "concat", 00:10:19.778 "base_bdevs": [ 00:10:19.778 "malloc1", 00:10:19.778 "malloc2" 00:10:19.778 ], 00:10:19.778 "superblock": false, 00:10:19.778 "strip_size_kb": 64, 00:10:19.778 "method": "bdev_raid_create", 00:10:19.778 "req_id": 1 00:10:19.778 } 
00:10:19.778 Got JSON-RPC error response 00:10:19.778 response: 00:10:19.778 { 00:10:19.778 "code": -17, 00:10:19.778 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:19.778 } 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:19.778 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:20.039 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:20.039 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:20.039 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:20.300 [2024-06-10 13:38:34.548953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:20.300 [2024-06-10 13:38:34.548975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:20.301 [2024-06-10 13:38:34.548986] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1af37c0 00:10:20.301 [2024-06-10 13:38:34.548992] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:20.301 [2024-06-10 13:38:34.550323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:20.301 [2024-06-10 13:38:34.550341] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:20.301 [2024-06-10 13:38:34.550385] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:20.301 [2024-06-10 13:38:34.550402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:20.301 pt1 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.301 "name": "raid_bdev1", 00:10:20.301 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:20.301 "strip_size_kb": 64, 
00:10:20.301 "state": "configuring", 00:10:20.301 "raid_level": "concat", 00:10:20.301 "superblock": true, 00:10:20.301 "num_base_bdevs": 2, 00:10:20.301 "num_base_bdevs_discovered": 1, 00:10:20.301 "num_base_bdevs_operational": 2, 00:10:20.301 "base_bdevs_list": [ 00:10:20.301 { 00:10:20.301 "name": "pt1", 00:10:20.301 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:20.301 "is_configured": true, 00:10:20.301 "data_offset": 2048, 00:10:20.301 "data_size": 63488 00:10:20.301 }, 00:10:20.301 { 00:10:20.301 "name": null, 00:10:20.301 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:20.301 "is_configured": false, 00:10:20.301 "data_offset": 2048, 00:10:20.301 "data_size": 63488 00:10:20.301 } 00:10:20.301 ] 00:10:20.301 }' 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.301 13:38:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.872 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:20.872 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:20.872 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:20.872 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:21.133 [2024-06-10 13:38:35.507406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:21.133 [2024-06-10 13:38:35.507438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:21.133 [2024-06-10 13:38:35.507449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1af3340 00:10:21.133 [2024-06-10 13:38:35.507456] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:21.133 [2024-06-10 
13:38:35.507737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:21.133 [2024-06-10 13:38:35.507749] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:21.133 [2024-06-10 13:38:35.507794] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:21.133 [2024-06-10 13:38:35.507807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:21.133 [2024-06-10 13:38:35.507881] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1af2d50 00:10:21.133 [2024-06-10 13:38:35.507887] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:21.133 [2024-06-10 13:38:35.508026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1af0220 00:10:21.133 [2024-06-10 13:38:35.508128] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1af2d50 00:10:21.133 [2024-06-10 13:38:35.508133] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1af2d50 00:10:21.133 [2024-06-10 13:38:35.508217] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:21.133 pt2 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:21.133 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.394 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:21.394 "name": "raid_bdev1", 00:10:21.394 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:21.394 "strip_size_kb": 64, 00:10:21.394 "state": "online", 00:10:21.394 "raid_level": "concat", 00:10:21.394 "superblock": true, 00:10:21.394 "num_base_bdevs": 2, 00:10:21.394 "num_base_bdevs_discovered": 2, 00:10:21.395 "num_base_bdevs_operational": 2, 00:10:21.395 "base_bdevs_list": [ 00:10:21.395 { 00:10:21.395 "name": "pt1", 00:10:21.395 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:21.395 "is_configured": true, 00:10:21.395 "data_offset": 2048, 00:10:21.395 "data_size": 63488 00:10:21.395 }, 00:10:21.395 { 00:10:21.395 "name": "pt2", 00:10:21.395 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:21.395 "is_configured": true, 00:10:21.395 "data_offset": 2048, 00:10:21.395 "data_size": 63488 00:10:21.395 } 00:10:21.395 ] 00:10:21.395 }' 00:10:21.395 13:38:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:21.395 13:38:35 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:21.966 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:22.227 [2024-06-10 13:38:36.486086] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:22.227 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:22.227 "name": "raid_bdev1", 00:10:22.227 "aliases": [ 00:10:22.227 "92d6026a-fd79-4325-bd6a-2bc16028013d" 00:10:22.227 ], 00:10:22.227 "product_name": "Raid Volume", 00:10:22.227 "block_size": 512, 00:10:22.227 "num_blocks": 126976, 00:10:22.227 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:22.227 "assigned_rate_limits": { 00:10:22.227 "rw_ios_per_sec": 0, 00:10:22.227 "rw_mbytes_per_sec": 0, 00:10:22.227 "r_mbytes_per_sec": 0, 00:10:22.227 "w_mbytes_per_sec": 0 00:10:22.227 }, 00:10:22.227 "claimed": false, 00:10:22.227 "zoned": false, 00:10:22.227 "supported_io_types": { 00:10:22.227 "read": true, 00:10:22.227 "write": true, 00:10:22.227 "unmap": true, 00:10:22.227 "write_zeroes": true, 00:10:22.227 "flush": true, 00:10:22.227 "reset": true, 00:10:22.227 "compare": 
false, 00:10:22.227 "compare_and_write": false, 00:10:22.227 "abort": false, 00:10:22.227 "nvme_admin": false, 00:10:22.227 "nvme_io": false 00:10:22.227 }, 00:10:22.227 "memory_domains": [ 00:10:22.227 { 00:10:22.227 "dma_device_id": "system", 00:10:22.227 "dma_device_type": 1 00:10:22.227 }, 00:10:22.227 { 00:10:22.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.227 "dma_device_type": 2 00:10:22.227 }, 00:10:22.227 { 00:10:22.227 "dma_device_id": "system", 00:10:22.227 "dma_device_type": 1 00:10:22.227 }, 00:10:22.227 { 00:10:22.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.227 "dma_device_type": 2 00:10:22.227 } 00:10:22.227 ], 00:10:22.227 "driver_specific": { 00:10:22.227 "raid": { 00:10:22.227 "uuid": "92d6026a-fd79-4325-bd6a-2bc16028013d", 00:10:22.227 "strip_size_kb": 64, 00:10:22.227 "state": "online", 00:10:22.227 "raid_level": "concat", 00:10:22.227 "superblock": true, 00:10:22.227 "num_base_bdevs": 2, 00:10:22.227 "num_base_bdevs_discovered": 2, 00:10:22.228 "num_base_bdevs_operational": 2, 00:10:22.228 "base_bdevs_list": [ 00:10:22.228 { 00:10:22.228 "name": "pt1", 00:10:22.228 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:22.228 "is_configured": true, 00:10:22.228 "data_offset": 2048, 00:10:22.228 "data_size": 63488 00:10:22.228 }, 00:10:22.228 { 00:10:22.228 "name": "pt2", 00:10:22.228 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:22.228 "is_configured": true, 00:10:22.228 "data_offset": 2048, 00:10:22.228 "data_size": 63488 00:10:22.228 } 00:10:22.228 ] 00:10:22.228 } 00:10:22.228 } 00:10:22.228 }' 00:10:22.228 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:22.228 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:22.228 pt2' 00:10:22.228 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:22.228 13:38:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:22.228 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:22.488 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:22.488 "name": "pt1", 00:10:22.488 "aliases": [ 00:10:22.488 "00000000-0000-0000-0000-000000000001" 00:10:22.488 ], 00:10:22.488 "product_name": "passthru", 00:10:22.488 "block_size": 512, 00:10:22.488 "num_blocks": 65536, 00:10:22.488 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:22.488 "assigned_rate_limits": { 00:10:22.488 "rw_ios_per_sec": 0, 00:10:22.488 "rw_mbytes_per_sec": 0, 00:10:22.488 "r_mbytes_per_sec": 0, 00:10:22.488 "w_mbytes_per_sec": 0 00:10:22.488 }, 00:10:22.488 "claimed": true, 00:10:22.488 "claim_type": "exclusive_write", 00:10:22.488 "zoned": false, 00:10:22.488 "supported_io_types": { 00:10:22.488 "read": true, 00:10:22.488 "write": true, 00:10:22.488 "unmap": true, 00:10:22.488 "write_zeroes": true, 00:10:22.488 "flush": true, 00:10:22.488 "reset": true, 00:10:22.488 "compare": false, 00:10:22.488 "compare_and_write": false, 00:10:22.488 "abort": true, 00:10:22.488 "nvme_admin": false, 00:10:22.488 "nvme_io": false 00:10:22.488 }, 00:10:22.488 "memory_domains": [ 00:10:22.488 { 00:10:22.488 "dma_device_id": "system", 00:10:22.488 "dma_device_type": 1 00:10:22.488 }, 00:10:22.489 { 00:10:22.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.489 "dma_device_type": 2 00:10:22.489 } 00:10:22.489 ], 00:10:22.489 "driver_specific": { 00:10:22.489 "passthru": { 00:10:22.489 "name": "pt1", 00:10:22.489 "base_bdev_name": "malloc1" 00:10:22.489 } 00:10:22.489 } 00:10:22.489 }' 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:22.489 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:22.749 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:22.749 13:38:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:22.749 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:22.749 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.749 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:22.749 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:22.749 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:23.010 "name": "pt2", 00:10:23.010 "aliases": [ 00:10:23.010 "00000000-0000-0000-0000-000000000002" 00:10:23.010 ], 00:10:23.010 "product_name": "passthru", 00:10:23.010 "block_size": 512, 00:10:23.010 "num_blocks": 65536, 00:10:23.010 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:23.010 "assigned_rate_limits": { 00:10:23.010 "rw_ios_per_sec": 0, 00:10:23.010 "rw_mbytes_per_sec": 0, 00:10:23.010 "r_mbytes_per_sec": 0, 00:10:23.010 "w_mbytes_per_sec": 0 00:10:23.010 }, 00:10:23.010 
"claimed": true, 00:10:23.010 "claim_type": "exclusive_write", 00:10:23.010 "zoned": false, 00:10:23.010 "supported_io_types": { 00:10:23.010 "read": true, 00:10:23.010 "write": true, 00:10:23.010 "unmap": true, 00:10:23.010 "write_zeroes": true, 00:10:23.010 "flush": true, 00:10:23.010 "reset": true, 00:10:23.010 "compare": false, 00:10:23.010 "compare_and_write": false, 00:10:23.010 "abort": true, 00:10:23.010 "nvme_admin": false, 00:10:23.010 "nvme_io": false 00:10:23.010 }, 00:10:23.010 "memory_domains": [ 00:10:23.010 { 00:10:23.010 "dma_device_id": "system", 00:10:23.010 "dma_device_type": 1 00:10:23.010 }, 00:10:23.010 { 00:10:23.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:23.010 "dma_device_type": 2 00:10:23.010 } 00:10:23.010 ], 00:10:23.010 "driver_specific": { 00:10:23.010 "passthru": { 00:10:23.010 "name": "pt2", 00:10:23.010 "base_bdev_name": "malloc2" 00:10:23.010 } 00:10:23.010 } 00:10:23.010 }' 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:23.010 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:23.270 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:23.531 [2024-06-10 13:38:37.753307] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 92d6026a-fd79-4325-bd6a-2bc16028013d '!=' 92d6026a-fd79-4325-bd6a-2bc16028013d ']' 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1498810 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1498810 ']' 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1498810 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1498810 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing 
process with pid 1498810' 00:10:23.531 killing process with pid 1498810 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1498810 00:10:23.531 [2024-06-10 13:38:37.824017] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:23.531 [2024-06-10 13:38:37.824063] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:23.531 [2024-06-10 13:38:37.824099] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:23.531 [2024-06-10 13:38:37.824105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1af2d50 name raid_bdev1, state offline 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1498810 00:10:23.531 [2024-06-10 13:38:37.833714] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:23.531 00:10:23.531 real 0m9.230s 00:10:23.531 user 0m16.797s 00:10:23.531 sys 0m1.411s 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:23.531 13:38:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:23.531 ************************************ 00:10:23.531 END TEST raid_superblock_test 00:10:23.531 ************************************ 00:10:23.531 13:38:37 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:10:23.531 13:38:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:23.531 13:38:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:23.531 13:38:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:23.792 ************************************ 00:10:23.792 START TEST raid_read_error_test 00:10:23.792 ************************************ 00:10:23.792 13:38:38 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 read 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:23.792 13:38:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vzfPb0MPJa 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1500952 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1500952 /var/tmp/spdk-raid.sock 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1500952 ']' 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:23.792 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:23.792 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:23.792 [2024-06-10 13:38:38.102777] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:10:23.792 [2024-06-10 13:38:38.102833] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1500952 ] 00:10:23.792 [2024-06-10 13:38:38.191199] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:23.792 [2024-06-10 13:38:38.260430] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:24.053 [2024-06-10 13:38:38.302404] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:24.053 [2024-06-10 13:38:38.302426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:24.624 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:24.625 13:38:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:24.625 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:24.625 13:38:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:24.888 BaseBdev1_malloc 00:10:24.888 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:24.888 true 00:10:25.148 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:25.148 [2024-06-10 13:38:39.550011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:25.148 [2024-06-10 13:38:39.550042] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.148 
[2024-06-10 13:38:39.550055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f0ac90 00:10:25.148 [2024-06-10 13:38:39.550062] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.148 [2024-06-10 13:38:39.551613] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.148 [2024-06-10 13:38:39.551634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:25.148 BaseBdev1 00:10:25.148 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:25.148 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:25.408 BaseBdev2_malloc 00:10:25.408 13:38:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:25.668 true 00:10:25.668 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:25.668 [2024-06-10 13:38:40.141542] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:25.668 [2024-06-10 13:38:40.141578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.668 [2024-06-10 13:38:40.141591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f0f400 00:10:25.668 [2024-06-10 13:38:40.141599] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.668 [2024-06-10 13:38:40.142987] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.668 [2024-06-10 13:38:40.143008] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:25.929 BaseBdev2 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:25.929 [2024-06-10 13:38:40.362108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:25.929 [2024-06-10 13:38:40.363193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:25.929 [2024-06-10 13:38:40.363341] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0ee20 00:10:25.929 [2024-06-10 13:38:40.363349] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:25.929 [2024-06-10 13:38:40.363510] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d60380 00:10:25.929 [2024-06-10 13:38:40.363627] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0ee20 00:10:25.929 [2024-06-10 13:38:40.363633] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f0ee20 00:10:25.929 [2024-06-10 13:38:40.363713] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.929 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:26.189 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:26.189 "name": "raid_bdev1", 00:10:26.189 "uuid": "66b62bc0-1b9f-4e44-a0f3-3fe14d35eafb", 00:10:26.189 "strip_size_kb": 64, 00:10:26.189 "state": "online", 00:10:26.189 "raid_level": "concat", 00:10:26.189 "superblock": true, 00:10:26.189 "num_base_bdevs": 2, 00:10:26.189 "num_base_bdevs_discovered": 2, 00:10:26.189 "num_base_bdevs_operational": 2, 00:10:26.189 "base_bdevs_list": [ 00:10:26.189 { 00:10:26.189 "name": "BaseBdev1", 00:10:26.189 "uuid": "2491c7cc-8bfc-502c-9aec-d7185377ee66", 00:10:26.189 "is_configured": true, 00:10:26.189 "data_offset": 2048, 00:10:26.189 "data_size": 63488 00:10:26.189 }, 00:10:26.189 { 00:10:26.189 "name": "BaseBdev2", 00:10:26.189 "uuid": "58d6f6cc-3bcd-5095-8f25-101822de1c4b", 00:10:26.189 "is_configured": true, 00:10:26.189 "data_offset": 2048, 00:10:26.189 "data_size": 63488 00:10:26.189 } 00:10:26.189 ] 00:10:26.189 }' 00:10:26.189 13:38:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:26.189 13:38:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:26.759 13:38:41 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:26.759 13:38:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:27.019 [2024-06-10 13:38:41.240540] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f13070 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:27.958 13:38:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.958 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:28.218 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:28.218 "name": "raid_bdev1", 00:10:28.218 "uuid": "66b62bc0-1b9f-4e44-a0f3-3fe14d35eafb", 00:10:28.218 "strip_size_kb": 64, 00:10:28.218 "state": "online", 00:10:28.218 "raid_level": "concat", 00:10:28.218 "superblock": true, 00:10:28.218 "num_base_bdevs": 2, 00:10:28.218 "num_base_bdevs_discovered": 2, 00:10:28.218 "num_base_bdevs_operational": 2, 00:10:28.218 "base_bdevs_list": [ 00:10:28.218 { 00:10:28.218 "name": "BaseBdev1", 00:10:28.218 "uuid": "2491c7cc-8bfc-502c-9aec-d7185377ee66", 00:10:28.218 "is_configured": true, 00:10:28.218 "data_offset": 2048, 00:10:28.218 "data_size": 63488 00:10:28.218 }, 00:10:28.218 { 00:10:28.218 "name": "BaseBdev2", 00:10:28.218 "uuid": "58d6f6cc-3bcd-5095-8f25-101822de1c4b", 00:10:28.218 "is_configured": true, 00:10:28.218 "data_offset": 2048, 00:10:28.218 "data_size": 63488 00:10:28.218 } 00:10:28.218 ] 00:10:28.218 }' 00:10:28.218 13:38:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:28.218 13:38:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.789 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:29.049 [2024-06-10 13:38:43.324215] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:29.049 [2024-06-10 13:38:43.324244] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:10:29.049 [2024-06-10 13:38:43.327040] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:29.049 [2024-06-10 13:38:43.327064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:29.049 [2024-06-10 13:38:43.327083] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:29.049 [2024-06-10 13:38:43.327089] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0ee20 name raid_bdev1, state offline 00:10:29.049 0 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1500952 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1500952 ']' 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1500952 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1500952 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1500952' 00:10:29.049 killing process with pid 1500952 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1500952 00:10:29.049 [2024-06-10 13:38:43.392200] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:29.049 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1500952 00:10:29.049 [2024-06-10 13:38:43.397736] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:29.309 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vzfPb0MPJa 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.48 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:10:29.310 00:10:29.310 real 0m5.500s 00:10:29.310 user 0m8.666s 00:10:29.310 sys 0m0.803s 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:29.310 13:38:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.310 ************************************ 00:10:29.310 END TEST raid_read_error_test 00:10:29.310 ************************************ 00:10:29.310 13:38:43 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:29.310 13:38:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:29.310 13:38:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:29.310 13:38:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:29.310 ************************************ 00:10:29.310 START TEST raid_write_error_test 00:10:29.310 ************************************ 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 write 00:10:29.310 13:38:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:29.310 13:38:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0Zuef3uV7n 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1502044 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1502044 /var/tmp/spdk-raid.sock 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1502044 ']' 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:29.310 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:29.310 13:38:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.310 [2024-06-10 13:38:43.675628] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:10:29.310 [2024-06-10 13:38:43.675673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1502044 ] 00:10:29.310 [2024-06-10 13:38:43.764714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.571 [2024-06-10 13:38:43.830079] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.571 [2024-06-10 13:38:43.874745] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.571 [2024-06-10 13:38:43.874770] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:30.143 13:38:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:30.143 13:38:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:30.143 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:30.143 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:30.403 BaseBdev1_malloc 00:10:30.403 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:30.663 true 00:10:30.663 13:38:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:30.663 [2024-06-10 13:38:45.106271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:30.663 [2024-06-10 13:38:45.106307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:10:30.663 [2024-06-10 13:38:45.106319] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dbec90 00:10:30.663 [2024-06-10 13:38:45.106326] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:30.663 [2024-06-10 13:38:45.107771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:30.663 [2024-06-10 13:38:45.107790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:30.663 BaseBdev1 00:10:30.663 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:30.663 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:30.924 BaseBdev2_malloc 00:10:30.924 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:31.187 true 00:10:31.187 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:31.510 [2024-06-10 13:38:45.697633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:31.510 [2024-06-10 13:38:45.697663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:31.510 [2024-06-10 13:38:45.697675] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dc3400 00:10:31.510 [2024-06-10 13:38:45.697681] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:31.510 [2024-06-10 13:38:45.698918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:31.510 [2024-06-10 13:38:45.698937] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:31.510 BaseBdev2 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:31.510 [2024-06-10 13:38:45.902176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:31.510 [2024-06-10 13:38:45.903224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:31.510 [2024-06-10 13:38:45.903369] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dc2e20 00:10:31.510 [2024-06-10 13:38:45.903377] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:31.510 [2024-06-10 13:38:45.903527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c14380 00:10:31.510 [2024-06-10 13:38:45.903644] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dc2e20 00:10:31.510 [2024-06-10 13:38:45.903650] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1dc2e20 00:10:31.510 [2024-06-10 13:38:45.903727] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.510 13:38:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:31.776 13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:31.776 "name": "raid_bdev1", 00:10:31.776 "uuid": "39f612fd-5657-472e-8f20-995157d44f0c", 00:10:31.776 "strip_size_kb": 64, 00:10:31.776 "state": "online", 00:10:31.776 "raid_level": "concat", 00:10:31.776 "superblock": true, 00:10:31.776 "num_base_bdevs": 2, 00:10:31.776 "num_base_bdevs_discovered": 2, 00:10:31.776 "num_base_bdevs_operational": 2, 00:10:31.776 "base_bdevs_list": [ 00:10:31.776 { 00:10:31.776 "name": "BaseBdev1", 00:10:31.776 "uuid": "c4cefc09-5280-5d0e-abaa-b18397c6a5da", 00:10:31.776 "is_configured": true, 00:10:31.776 "data_offset": 2048, 00:10:31.776 "data_size": 63488 00:10:31.776 }, 00:10:31.776 { 00:10:31.776 "name": "BaseBdev2", 00:10:31.776 "uuid": "cbdfbb6c-5968-5e12-b85a-117ab3ea056e", 00:10:31.776 "is_configured": true, 00:10:31.776 "data_offset": 2048, 00:10:31.776 "data_size": 63488 00:10:31.776 } 00:10:31.776 ] 00:10:31.776 }' 00:10:31.776 13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:31.776 13:38:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.346 
13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:32.346 13:38:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:32.346 [2024-06-10 13:38:46.752537] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dc7070 00:10:33.287 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.547 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.548 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:10:33.548 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.548 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.548 13:38:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:33.808 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:33.808 "name": "raid_bdev1", 00:10:33.808 "uuid": "39f612fd-5657-472e-8f20-995157d44f0c", 00:10:33.808 "strip_size_kb": 64, 00:10:33.808 "state": "online", 00:10:33.808 "raid_level": "concat", 00:10:33.808 "superblock": true, 00:10:33.808 "num_base_bdevs": 2, 00:10:33.808 "num_base_bdevs_discovered": 2, 00:10:33.808 "num_base_bdevs_operational": 2, 00:10:33.808 "base_bdevs_list": [ 00:10:33.808 { 00:10:33.808 "name": "BaseBdev1", 00:10:33.808 "uuid": "c4cefc09-5280-5d0e-abaa-b18397c6a5da", 00:10:33.808 "is_configured": true, 00:10:33.808 "data_offset": 2048, 00:10:33.808 "data_size": 63488 00:10:33.808 }, 00:10:33.808 { 00:10:33.808 "name": "BaseBdev2", 00:10:33.808 "uuid": "cbdfbb6c-5968-5e12-b85a-117ab3ea056e", 00:10:33.808 "is_configured": true, 00:10:33.808 "data_offset": 2048, 00:10:33.808 "data_size": 63488 00:10:33.808 } 00:10:33.808 ] 00:10:33.808 }' 00:10:33.808 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:33.808 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:34.379 [2024-06-10 13:38:48.807367] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:34.379 [2024-06-10 13:38:48.807400] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:10:34.379 [2024-06-10 13:38:48.810208] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:34.379 [2024-06-10 13:38:48.810233] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.379 [2024-06-10 13:38:48.810252] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:34.379 [2024-06-10 13:38:48.810258] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dc2e20 name raid_bdev1, state offline 00:10:34.379 0 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1502044 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1502044 ']' 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1502044 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:34.379 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1502044 00:10:34.640 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:34.640 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:34.640 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1502044' 00:10:34.640 killing process with pid 1502044 00:10:34.640 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1502044 00:10:34.640 [2024-06-10 13:38:48.876797] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:34.640 13:38:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1502044 
00:10:34.640 [2024-06-10 13:38:48.882318] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0Zuef3uV7n 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:10:34.640 00:10:34.640 real 0m5.411s 00:10:34.640 user 0m8.535s 00:10:34.640 sys 0m0.748s 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:34.640 13:38:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.640 ************************************ 00:10:34.640 END TEST raid_write_error_test 00:10:34.640 ************************************ 00:10:34.640 13:38:49 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:34.640 13:38:49 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:10:34.640 13:38:49 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:34.640 13:38:49 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:34.640 13:38:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:34.640 ************************************ 00:10:34.640 START TEST raid_state_function_test 00:10:34.640 ************************************ 
00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 false 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:34.640 13:38:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1503277 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1503277' 00:10:34.640 Process raid pid: 1503277 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1503277 /var/tmp/spdk-raid.sock 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1503277 ']' 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:34.640 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:34.640 13:38:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.902 [2024-06-10 13:38:49.154146] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:10:34.902 [2024-06-10 13:38:49.154195] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:34.902 [2024-06-10 13:38:49.241278] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.902 [2024-06-10 13:38:49.305996] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.902 [2024-06-10 13:38:49.346253] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.902 [2024-06-10 13:38:49.346276] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:35.844 [2024-06-10 13:38:50.198735] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:35.844 [2024-06-10 13:38:50.198773] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:35.844 [2024-06-10 13:38:50.198779] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:35.844 [2024-06-10 13:38:50.198786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.844 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.104 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:36.104 "name": "Existed_Raid", 00:10:36.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.104 "strip_size_kb": 0, 00:10:36.104 "state": "configuring", 00:10:36.105 "raid_level": "raid1", 00:10:36.105 "superblock": false, 00:10:36.105 "num_base_bdevs": 2, 00:10:36.105 "num_base_bdevs_discovered": 0, 00:10:36.105 "num_base_bdevs_operational": 2, 00:10:36.105 "base_bdevs_list": 
[ 00:10:36.105 { 00:10:36.105 "name": "BaseBdev1", 00:10:36.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.105 "is_configured": false, 00:10:36.105 "data_offset": 0, 00:10:36.105 "data_size": 0 00:10:36.105 }, 00:10:36.105 { 00:10:36.105 "name": "BaseBdev2", 00:10:36.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:36.105 "is_configured": false, 00:10:36.105 "data_offset": 0, 00:10:36.105 "data_size": 0 00:10:36.105 } 00:10:36.105 ] 00:10:36.105 }' 00:10:36.105 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:36.105 13:38:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.675 13:38:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:36.675 [2024-06-10 13:38:51.141028] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:36.675 [2024-06-10 13:38:51.141052] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x83e720 name Existed_Raid, state configuring 00:10:36.936 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:36.936 [2024-06-10 13:38:51.341543] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:36.936 [2024-06-10 13:38:51.341565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:36.936 [2024-06-10 13:38:51.341571] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:36.936 [2024-06-10 13:38:51.341577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:36.936 13:38:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:37.196 [2024-06-10 13:38:51.549002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:37.196 BaseBdev1 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:37.196 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:37.456 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:37.716 [ 00:10:37.716 { 00:10:37.716 "name": "BaseBdev1", 00:10:37.716 "aliases": [ 00:10:37.716 "280e8116-e556-47ae-a4db-c0823d57f3b7" 00:10:37.716 ], 00:10:37.716 "product_name": "Malloc disk", 00:10:37.716 "block_size": 512, 00:10:37.716 "num_blocks": 65536, 00:10:37.716 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:37.716 "assigned_rate_limits": { 00:10:37.716 "rw_ios_per_sec": 0, 00:10:37.716 "rw_mbytes_per_sec": 0, 00:10:37.716 "r_mbytes_per_sec": 0, 00:10:37.716 "w_mbytes_per_sec": 0 00:10:37.716 }, 00:10:37.716 "claimed": true, 00:10:37.716 "claim_type": 
"exclusive_write", 00:10:37.716 "zoned": false, 00:10:37.716 "supported_io_types": { 00:10:37.716 "read": true, 00:10:37.716 "write": true, 00:10:37.716 "unmap": true, 00:10:37.716 "write_zeroes": true, 00:10:37.716 "flush": true, 00:10:37.716 "reset": true, 00:10:37.716 "compare": false, 00:10:37.716 "compare_and_write": false, 00:10:37.716 "abort": true, 00:10:37.716 "nvme_admin": false, 00:10:37.716 "nvme_io": false 00:10:37.717 }, 00:10:37.717 "memory_domains": [ 00:10:37.717 { 00:10:37.717 "dma_device_id": "system", 00:10:37.717 "dma_device_type": 1 00:10:37.717 }, 00:10:37.717 { 00:10:37.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.717 "dma_device_type": 2 00:10:37.717 } 00:10:37.717 ], 00:10:37.717 "driver_specific": {} 00:10:37.717 } 00:10:37.717 ] 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.717 13:38:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:37.717 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:37.717 "name": "Existed_Raid", 00:10:37.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.717 "strip_size_kb": 0, 00:10:37.717 "state": "configuring", 00:10:37.717 "raid_level": "raid1", 00:10:37.717 "superblock": false, 00:10:37.717 "num_base_bdevs": 2, 00:10:37.717 "num_base_bdevs_discovered": 1, 00:10:37.717 "num_base_bdevs_operational": 2, 00:10:37.717 "base_bdevs_list": [ 00:10:37.717 { 00:10:37.717 "name": "BaseBdev1", 00:10:37.717 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:37.717 "is_configured": true, 00:10:37.717 "data_offset": 0, 00:10:37.717 "data_size": 65536 00:10:37.717 }, 00:10:37.717 { 00:10:37.717 "name": "BaseBdev2", 00:10:37.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:37.717 "is_configured": false, 00:10:37.717 "data_offset": 0, 00:10:37.717 "data_size": 0 00:10:37.717 } 00:10:37.717 ] 00:10:37.717 }' 00:10:37.717 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:37.717 13:38:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.289 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:38.549 [2024-06-10 13:38:52.912463] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:38.549 [2024-06-10 13:38:52.912497] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x83e010 name 
Existed_Raid, state configuring 00:10:38.549 13:38:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:38.810 [2024-06-10 13:38:53.096948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:38.810 [2024-06-10 13:38:53.098213] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:38.810 [2024-06-10 13:38:53.098240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.810 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:39.070 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:39.070 "name": "Existed_Raid", 00:10:39.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.070 "strip_size_kb": 0, 00:10:39.070 "state": "configuring", 00:10:39.070 "raid_level": "raid1", 00:10:39.070 "superblock": false, 00:10:39.070 "num_base_bdevs": 2, 00:10:39.070 "num_base_bdevs_discovered": 1, 00:10:39.070 "num_base_bdevs_operational": 2, 00:10:39.070 "base_bdevs_list": [ 00:10:39.070 { 00:10:39.070 "name": "BaseBdev1", 00:10:39.070 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:39.070 "is_configured": true, 00:10:39.070 "data_offset": 0, 00:10:39.070 "data_size": 65536 00:10:39.070 }, 00:10:39.070 { 00:10:39.070 "name": "BaseBdev2", 00:10:39.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:39.070 "is_configured": false, 00:10:39.070 "data_offset": 0, 00:10:39.070 "data_size": 0 00:10:39.070 } 00:10:39.070 ] 00:10:39.070 }' 00:10:39.070 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:39.070 13:38:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.642 13:38:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:39.643 [2024-06-10 13:38:54.004414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:39.643 [2024-06-10 13:38:54.004440] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device 
register 0x83ee00 00:10:39.643 [2024-06-10 13:38:54.004445] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:10:39.643 [2024-06-10 13:38:54.004604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9f24a0 00:10:39.643 [2024-06-10 13:38:54.004703] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x83ee00 00:10:39.643 [2024-06-10 13:38:54.004709] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x83ee00 00:10:39.643 [2024-06-10 13:38:54.004840] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:39.643 BaseBdev2 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:39.643 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:39.903 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:40.164 [ 00:10:40.164 { 00:10:40.164 "name": "BaseBdev2", 00:10:40.164 "aliases": [ 00:10:40.164 "a0ace808-7ec9-4e08-b521-658948f562a0" 00:10:40.164 ], 00:10:40.164 "product_name": "Malloc disk", 00:10:40.164 "block_size": 
512, 00:10:40.164 "num_blocks": 65536, 00:10:40.164 "uuid": "a0ace808-7ec9-4e08-b521-658948f562a0", 00:10:40.164 "assigned_rate_limits": { 00:10:40.164 "rw_ios_per_sec": 0, 00:10:40.164 "rw_mbytes_per_sec": 0, 00:10:40.164 "r_mbytes_per_sec": 0, 00:10:40.164 "w_mbytes_per_sec": 0 00:10:40.164 }, 00:10:40.164 "claimed": true, 00:10:40.164 "claim_type": "exclusive_write", 00:10:40.164 "zoned": false, 00:10:40.164 "supported_io_types": { 00:10:40.164 "read": true, 00:10:40.164 "write": true, 00:10:40.164 "unmap": true, 00:10:40.164 "write_zeroes": true, 00:10:40.164 "flush": true, 00:10:40.164 "reset": true, 00:10:40.164 "compare": false, 00:10:40.164 "compare_and_write": false, 00:10:40.164 "abort": true, 00:10:40.164 "nvme_admin": false, 00:10:40.164 "nvme_io": false 00:10:40.164 }, 00:10:40.164 "memory_domains": [ 00:10:40.164 { 00:10:40.164 "dma_device_id": "system", 00:10:40.164 "dma_device_type": 1 00:10:40.164 }, 00:10:40.164 { 00:10:40.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.164 "dma_device_type": 2 00:10:40.164 } 00:10:40.164 ], 00:10:40.164 "driver_specific": {} 00:10:40.164 } 00:10:40.164 ] 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.164 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:40.165 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:40.165 "name": "Existed_Raid", 00:10:40.165 "uuid": "15f63683-b475-445b-ad4f-53d66f2508a4", 00:10:40.165 "strip_size_kb": 0, 00:10:40.165 "state": "online", 00:10:40.165 "raid_level": "raid1", 00:10:40.165 "superblock": false, 00:10:40.165 "num_base_bdevs": 2, 00:10:40.165 "num_base_bdevs_discovered": 2, 00:10:40.165 "num_base_bdevs_operational": 2, 00:10:40.165 "base_bdevs_list": [ 00:10:40.165 { 00:10:40.165 "name": "BaseBdev1", 00:10:40.165 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:40.165 "is_configured": true, 00:10:40.165 "data_offset": 0, 00:10:40.165 "data_size": 65536 00:10:40.165 }, 00:10:40.165 { 00:10:40.165 "name": "BaseBdev2", 00:10:40.165 "uuid": "a0ace808-7ec9-4e08-b521-658948f562a0", 00:10:40.165 "is_configured": true, 00:10:40.165 "data_offset": 0, 00:10:40.165 "data_size": 65536 00:10:40.165 } 00:10:40.165 ] 00:10:40.165 }' 00:10:40.165 13:38:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:10:40.165 13:38:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:40.735 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:40.995 [2024-06-10 13:38:55.352052] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:40.995 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:40.995 "name": "Existed_Raid", 00:10:40.995 "aliases": [ 00:10:40.995 "15f63683-b475-445b-ad4f-53d66f2508a4" 00:10:40.995 ], 00:10:40.995 "product_name": "Raid Volume", 00:10:40.995 "block_size": 512, 00:10:40.995 "num_blocks": 65536, 00:10:40.995 "uuid": "15f63683-b475-445b-ad4f-53d66f2508a4", 00:10:40.995 "assigned_rate_limits": { 00:10:40.995 "rw_ios_per_sec": 0, 00:10:40.995 "rw_mbytes_per_sec": 0, 00:10:40.995 "r_mbytes_per_sec": 0, 00:10:40.995 "w_mbytes_per_sec": 0 00:10:40.995 }, 00:10:40.995 "claimed": false, 00:10:40.995 "zoned": false, 00:10:40.995 "supported_io_types": { 00:10:40.995 "read": true, 00:10:40.995 "write": true, 00:10:40.995 "unmap": false, 
00:10:40.995 "write_zeroes": true, 00:10:40.995 "flush": false, 00:10:40.995 "reset": true, 00:10:40.995 "compare": false, 00:10:40.995 "compare_and_write": false, 00:10:40.995 "abort": false, 00:10:40.995 "nvme_admin": false, 00:10:40.995 "nvme_io": false 00:10:40.995 }, 00:10:40.995 "memory_domains": [ 00:10:40.995 { 00:10:40.995 "dma_device_id": "system", 00:10:40.995 "dma_device_type": 1 00:10:40.995 }, 00:10:40.995 { 00:10:40.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.995 "dma_device_type": 2 00:10:40.995 }, 00:10:40.995 { 00:10:40.995 "dma_device_id": "system", 00:10:40.995 "dma_device_type": 1 00:10:40.995 }, 00:10:40.995 { 00:10:40.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.995 "dma_device_type": 2 00:10:40.995 } 00:10:40.995 ], 00:10:40.995 "driver_specific": { 00:10:40.995 "raid": { 00:10:40.995 "uuid": "15f63683-b475-445b-ad4f-53d66f2508a4", 00:10:40.995 "strip_size_kb": 0, 00:10:40.995 "state": "online", 00:10:40.995 "raid_level": "raid1", 00:10:40.995 "superblock": false, 00:10:40.995 "num_base_bdevs": 2, 00:10:40.995 "num_base_bdevs_discovered": 2, 00:10:40.995 "num_base_bdevs_operational": 2, 00:10:40.995 "base_bdevs_list": [ 00:10:40.995 { 00:10:40.995 "name": "BaseBdev1", 00:10:40.995 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:40.995 "is_configured": true, 00:10:40.995 "data_offset": 0, 00:10:40.995 "data_size": 65536 00:10:40.995 }, 00:10:40.995 { 00:10:40.995 "name": "BaseBdev2", 00:10:40.995 "uuid": "a0ace808-7ec9-4e08-b521-658948f562a0", 00:10:40.995 "is_configured": true, 00:10:40.996 "data_offset": 0, 00:10:40.996 "data_size": 65536 00:10:40.996 } 00:10:40.996 ] 00:10:40.996 } 00:10:40.996 } 00:10:40.996 }' 00:10:40.996 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:40.996 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:40.996 
BaseBdev2' 00:10:40.996 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:40.996 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:40.996 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:41.256 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:41.256 "name": "BaseBdev1", 00:10:41.256 "aliases": [ 00:10:41.256 "280e8116-e556-47ae-a4db-c0823d57f3b7" 00:10:41.256 ], 00:10:41.256 "product_name": "Malloc disk", 00:10:41.256 "block_size": 512, 00:10:41.256 "num_blocks": 65536, 00:10:41.256 "uuid": "280e8116-e556-47ae-a4db-c0823d57f3b7", 00:10:41.256 "assigned_rate_limits": { 00:10:41.256 "rw_ios_per_sec": 0, 00:10:41.256 "rw_mbytes_per_sec": 0, 00:10:41.256 "r_mbytes_per_sec": 0, 00:10:41.256 "w_mbytes_per_sec": 0 00:10:41.256 }, 00:10:41.256 "claimed": true, 00:10:41.256 "claim_type": "exclusive_write", 00:10:41.256 "zoned": false, 00:10:41.256 "supported_io_types": { 00:10:41.256 "read": true, 00:10:41.256 "write": true, 00:10:41.256 "unmap": true, 00:10:41.256 "write_zeroes": true, 00:10:41.256 "flush": true, 00:10:41.256 "reset": true, 00:10:41.256 "compare": false, 00:10:41.256 "compare_and_write": false, 00:10:41.256 "abort": true, 00:10:41.256 "nvme_admin": false, 00:10:41.256 "nvme_io": false 00:10:41.256 }, 00:10:41.256 "memory_domains": [ 00:10:41.256 { 00:10:41.256 "dma_device_id": "system", 00:10:41.256 "dma_device_type": 1 00:10:41.256 }, 00:10:41.256 { 00:10:41.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.256 "dma_device_type": 2 00:10:41.256 } 00:10:41.256 ], 00:10:41.256 "driver_specific": {} 00:10:41.256 }' 00:10:41.256 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:41.256 13:38:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:41.256 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:41.256 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:41.517 13:38:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:41.777 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:41.777 "name": "BaseBdev2", 00:10:41.777 "aliases": [ 00:10:41.777 "a0ace808-7ec9-4e08-b521-658948f562a0" 00:10:41.777 ], 00:10:41.777 "product_name": "Malloc disk", 00:10:41.777 "block_size": 512, 00:10:41.777 "num_blocks": 65536, 00:10:41.777 "uuid": "a0ace808-7ec9-4e08-b521-658948f562a0", 00:10:41.777 "assigned_rate_limits": { 00:10:41.777 
"rw_ios_per_sec": 0, 00:10:41.777 "rw_mbytes_per_sec": 0, 00:10:41.777 "r_mbytes_per_sec": 0, 00:10:41.777 "w_mbytes_per_sec": 0 00:10:41.777 }, 00:10:41.777 "claimed": true, 00:10:41.777 "claim_type": "exclusive_write", 00:10:41.777 "zoned": false, 00:10:41.777 "supported_io_types": { 00:10:41.777 "read": true, 00:10:41.777 "write": true, 00:10:41.777 "unmap": true, 00:10:41.777 "write_zeroes": true, 00:10:41.777 "flush": true, 00:10:41.777 "reset": true, 00:10:41.777 "compare": false, 00:10:41.777 "compare_and_write": false, 00:10:41.777 "abort": true, 00:10:41.777 "nvme_admin": false, 00:10:41.777 "nvme_io": false 00:10:41.777 }, 00:10:41.777 "memory_domains": [ 00:10:41.777 { 00:10:41.777 "dma_device_id": "system", 00:10:41.777 "dma_device_type": 1 00:10:41.777 }, 00:10:41.777 { 00:10:41.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.777 "dma_device_type": 2 00:10:41.777 } 00:10:41.777 ], 00:10:41.777 "driver_specific": {} 00:10:41.777 }' 00:10:41.777 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:41.777 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:41.777 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:41.777 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:42.039 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:42.299 [2024-06-10 13:38:56.675262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:42.300 13:38:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.300 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:42.560 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:42.560 "name": "Existed_Raid", 00:10:42.560 "uuid": "15f63683-b475-445b-ad4f-53d66f2508a4", 00:10:42.560 "strip_size_kb": 0, 00:10:42.560 "state": "online", 00:10:42.560 "raid_level": "raid1", 00:10:42.560 "superblock": false, 00:10:42.560 "num_base_bdevs": 2, 00:10:42.560 "num_base_bdevs_discovered": 1, 00:10:42.560 "num_base_bdevs_operational": 1, 00:10:42.560 "base_bdevs_list": [ 00:10:42.560 { 00:10:42.560 "name": null, 00:10:42.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:42.560 "is_configured": false, 00:10:42.560 "data_offset": 0, 00:10:42.560 "data_size": 65536 00:10:42.560 }, 00:10:42.560 { 00:10:42.560 "name": "BaseBdev2", 00:10:42.560 "uuid": "a0ace808-7ec9-4e08-b521-658948f562a0", 00:10:42.560 "is_configured": true, 00:10:42.560 "data_offset": 0, 00:10:42.560 "data_size": 65536 00:10:42.560 } 00:10:42.560 ] 00:10:42.560 }' 00:10:42.560 13:38:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:42.560 13:38:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:43.129 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:43.129 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:43.129 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.129 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:43.390 [2024-06-10 13:38:57.818174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:43.390 [2024-06-10 13:38:57.818239] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:43.390 [2024-06-10 13:38:57.824498] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:43.390 [2024-06-10 13:38:57.824523] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:43.390 [2024-06-10 13:38:57.824529] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x83ee00 name Existed_Raid, state offline 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.390 13:38:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:43.650 13:38:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1503277 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1503277 ']' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1503277 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1503277 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1503277' 00:10:43.650 killing process with pid 1503277 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1503277 00:10:43.650 [2024-06-10 13:38:58.096603] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:43.650 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1503277 00:10:43.650 [2024-06-10 13:38:58.097229] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:43.910 13:38:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:43.910 00:10:43.911 real 0m9.126s 00:10:43.911 user 0m16.592s 00:10:43.911 sys 0m1.377s 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # 
xtrace_disable 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:43.911 ************************************ 00:10:43.911 END TEST raid_state_function_test 00:10:43.911 ************************************ 00:10:43.911 13:38:58 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:10:43.911 13:38:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:43.911 13:38:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:43.911 13:38:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:43.911 ************************************ 00:10:43.911 START TEST raid_state_function_test_sb 00:10:43.911 ************************************ 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1505339 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1505339' 00:10:43.911 Process raid pid: 1505339 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1505339 
/var/tmp/spdk-raid.sock 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1505339 ']' 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:43.911 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:43.911 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:43.911 [2024-06-10 13:38:58.317978] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:10:43.911 [2024-06-10 13:38:58.318010] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:44.171 [2024-06-10 13:38:58.395461] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.171 [2024-06-10 13:38:58.463367] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.171 [2024-06-10 13:38:58.503444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.171 [2024-06-10 13:38:58.503466] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.171 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:44.171 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:10:44.171 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:44.432 [2024-06-10 13:38:58.677765] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:44.432 [2024-06-10 13:38:58.677790] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:44.432 [2024-06-10 13:38:58.677796] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:44.432 [2024-06-10 13:38:58.677803] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:44.432 13:38:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:44.432 "name": "Existed_Raid", 00:10:44.432 "uuid": "a8ec2c20-047f-4a05-984f-27b5e06d1f40", 00:10:44.432 "strip_size_kb": 0, 00:10:44.432 "state": "configuring", 00:10:44.432 "raid_level": "raid1", 00:10:44.432 "superblock": true, 00:10:44.432 "num_base_bdevs": 2, 00:10:44.432 "num_base_bdevs_discovered": 0, 00:10:44.432 "num_base_bdevs_operational": 2, 00:10:44.432 "base_bdevs_list": [ 00:10:44.432 { 00:10:44.432 "name": "BaseBdev1", 00:10:44.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:44.432 "is_configured": false, 00:10:44.432 "data_offset": 0, 00:10:44.432 "data_size": 0 00:10:44.432 }, 00:10:44.432 { 00:10:44.432 "name": 
"BaseBdev2", 00:10:44.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:44.432 "is_configured": false, 00:10:44.432 "data_offset": 0, 00:10:44.432 "data_size": 0 00:10:44.432 } 00:10:44.432 ] 00:10:44.432 }' 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:44.432 13:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:45.002 13:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:45.262 [2024-06-10 13:38:59.603998] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:45.262 [2024-06-10 13:38:59.604016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x194f720 name Existed_Raid, state configuring 00:10:45.262 13:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:45.523 [2024-06-10 13:38:59.808533] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:45.523 [2024-06-10 13:38:59.808549] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:45.523 [2024-06-10 13:38:59.808555] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:45.523 [2024-06-10 13:38:59.808561] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:45.523 13:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:45.784 [2024-06-10 13:39:00.012063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:10:45.784 BaseBdev1 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:45.784 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:46.044 [ 00:10:46.044 { 00:10:46.044 "name": "BaseBdev1", 00:10:46.044 "aliases": [ 00:10:46.044 "6a154431-bd29-4fb9-bad3-5601be956e2e" 00:10:46.044 ], 00:10:46.044 "product_name": "Malloc disk", 00:10:46.044 "block_size": 512, 00:10:46.044 "num_blocks": 65536, 00:10:46.044 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:46.044 "assigned_rate_limits": { 00:10:46.044 "rw_ios_per_sec": 0, 00:10:46.044 "rw_mbytes_per_sec": 0, 00:10:46.044 "r_mbytes_per_sec": 0, 00:10:46.044 "w_mbytes_per_sec": 0 00:10:46.044 }, 00:10:46.044 "claimed": true, 00:10:46.044 "claim_type": "exclusive_write", 00:10:46.044 "zoned": false, 00:10:46.044 "supported_io_types": { 00:10:46.044 "read": true, 00:10:46.044 "write": true, 00:10:46.044 "unmap": true, 00:10:46.044 "write_zeroes": true, 00:10:46.044 "flush": true, 00:10:46.044 
"reset": true, 00:10:46.044 "compare": false, 00:10:46.044 "compare_and_write": false, 00:10:46.044 "abort": true, 00:10:46.044 "nvme_admin": false, 00:10:46.044 "nvme_io": false 00:10:46.044 }, 00:10:46.044 "memory_domains": [ 00:10:46.044 { 00:10:46.044 "dma_device_id": "system", 00:10:46.044 "dma_device_type": 1 00:10:46.044 }, 00:10:46.044 { 00:10:46.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.044 "dma_device_type": 2 00:10:46.044 } 00:10:46.044 ], 00:10:46.044 "driver_specific": {} 00:10:46.044 } 00:10:46.044 ] 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:46.044 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.045 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:46.305 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:46.305 "name": "Existed_Raid", 00:10:46.305 "uuid": "51fb7646-e245-4618-a3cd-66c16e9dc6b7", 00:10:46.305 "strip_size_kb": 0, 00:10:46.305 "state": "configuring", 00:10:46.305 "raid_level": "raid1", 00:10:46.305 "superblock": true, 00:10:46.305 "num_base_bdevs": 2, 00:10:46.305 "num_base_bdevs_discovered": 1, 00:10:46.305 "num_base_bdevs_operational": 2, 00:10:46.305 "base_bdevs_list": [ 00:10:46.305 { 00:10:46.305 "name": "BaseBdev1", 00:10:46.305 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:46.305 "is_configured": true, 00:10:46.305 "data_offset": 2048, 00:10:46.305 "data_size": 63488 00:10:46.305 }, 00:10:46.305 { 00:10:46.305 "name": "BaseBdev2", 00:10:46.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.305 "is_configured": false, 00:10:46.305 "data_offset": 0, 00:10:46.305 "data_size": 0 00:10:46.305 } 00:10:46.305 ] 00:10:46.305 }' 00:10:46.306 13:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:46.306 13:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:46.877 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:47.138 [2024-06-10 13:39:01.375529] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:47.138 [2024-06-10 13:39:01.375556] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x194f010 name Existed_Raid, state configuring 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:47.138 [2024-06-10 13:39:01.580077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:47.138 [2024-06-10 13:39:01.581304] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:47.138 [2024-06-10 13:39:01.581329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.138 13:39:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.138 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.400 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.400 "name": "Existed_Raid", 00:10:47.400 "uuid": "0d53e7bf-65d4-444a-880c-df2f75ca75f2", 00:10:47.400 "strip_size_kb": 0, 00:10:47.400 "state": "configuring", 00:10:47.400 "raid_level": "raid1", 00:10:47.400 "superblock": true, 00:10:47.400 "num_base_bdevs": 2, 00:10:47.400 "num_base_bdevs_discovered": 1, 00:10:47.400 "num_base_bdevs_operational": 2, 00:10:47.400 "base_bdevs_list": [ 00:10:47.400 { 00:10:47.400 "name": "BaseBdev1", 00:10:47.400 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:47.400 "is_configured": true, 00:10:47.400 "data_offset": 2048, 00:10:47.400 "data_size": 63488 00:10:47.400 }, 00:10:47.400 { 00:10:47.400 "name": "BaseBdev2", 00:10:47.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.400 "is_configured": false, 00:10:47.400 "data_offset": 0, 00:10:47.400 "data_size": 0 00:10:47.400 } 00:10:47.400 ] 00:10:47.400 }' 00:10:47.400 13:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.400 13:39:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:47.971 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:48.233 [2024-06-10 13:39:02.555550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:48.233 [2024-06-10 13:39:02.555655] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x194fe00 00:10:48.233 
[2024-06-10 13:39:02.555663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:48.233 [2024-06-10 13:39:02.555811] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1951260 00:10:48.233 [2024-06-10 13:39:02.555905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x194fe00 00:10:48.233 [2024-06-10 13:39:02.555911] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x194fe00 00:10:48.233 [2024-06-10 13:39:02.555988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.233 BaseBdev2 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:48.233 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:48.494 [ 00:10:48.494 { 00:10:48.494 "name": "BaseBdev2", 00:10:48.494 "aliases": [ 00:10:48.494 "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb" 00:10:48.494 ], 00:10:48.494 "product_name": "Malloc disk", 00:10:48.494 "block_size": 512, 
00:10:48.494 "num_blocks": 65536, 00:10:48.494 "uuid": "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb", 00:10:48.494 "assigned_rate_limits": { 00:10:48.494 "rw_ios_per_sec": 0, 00:10:48.494 "rw_mbytes_per_sec": 0, 00:10:48.494 "r_mbytes_per_sec": 0, 00:10:48.494 "w_mbytes_per_sec": 0 00:10:48.494 }, 00:10:48.494 "claimed": true, 00:10:48.494 "claim_type": "exclusive_write", 00:10:48.494 "zoned": false, 00:10:48.494 "supported_io_types": { 00:10:48.494 "read": true, 00:10:48.494 "write": true, 00:10:48.494 "unmap": true, 00:10:48.494 "write_zeroes": true, 00:10:48.494 "flush": true, 00:10:48.494 "reset": true, 00:10:48.494 "compare": false, 00:10:48.494 "compare_and_write": false, 00:10:48.494 "abort": true, 00:10:48.494 "nvme_admin": false, 00:10:48.494 "nvme_io": false 00:10:48.494 }, 00:10:48.494 "memory_domains": [ 00:10:48.494 { 00:10:48.494 "dma_device_id": "system", 00:10:48.494 "dma_device_type": 1 00:10:48.494 }, 00:10:48.494 { 00:10:48.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.494 "dma_device_type": 2 00:10:48.494 } 00:10:48.494 ], 00:10:48.494 "driver_specific": {} 00:10:48.494 } 00:10:48.494 ] 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:48.494 13:39:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.494 13:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.755 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.755 "name": "Existed_Raid", 00:10:48.755 "uuid": "0d53e7bf-65d4-444a-880c-df2f75ca75f2", 00:10:48.755 "strip_size_kb": 0, 00:10:48.755 "state": "online", 00:10:48.755 "raid_level": "raid1", 00:10:48.755 "superblock": true, 00:10:48.755 "num_base_bdevs": 2, 00:10:48.755 "num_base_bdevs_discovered": 2, 00:10:48.755 "num_base_bdevs_operational": 2, 00:10:48.755 "base_bdevs_list": [ 00:10:48.755 { 00:10:48.755 "name": "BaseBdev1", 00:10:48.755 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:48.755 "is_configured": true, 00:10:48.755 "data_offset": 2048, 00:10:48.755 "data_size": 63488 00:10:48.755 }, 00:10:48.755 { 00:10:48.755 "name": "BaseBdev2", 00:10:48.755 "uuid": "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb", 00:10:48.755 "is_configured": true, 00:10:48.755 "data_offset": 2048, 00:10:48.755 "data_size": 63488 00:10:48.755 } 00:10:48.755 ] 00:10:48.755 }' 00:10:48.755 
13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.755 13:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:49.327 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:49.327 [2024-06-10 13:39:03.794899] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:49.588 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:49.588 "name": "Existed_Raid", 00:10:49.588 "aliases": [ 00:10:49.588 "0d53e7bf-65d4-444a-880c-df2f75ca75f2" 00:10:49.588 ], 00:10:49.588 "product_name": "Raid Volume", 00:10:49.588 "block_size": 512, 00:10:49.588 "num_blocks": 63488, 00:10:49.588 "uuid": "0d53e7bf-65d4-444a-880c-df2f75ca75f2", 00:10:49.588 "assigned_rate_limits": { 00:10:49.588 "rw_ios_per_sec": 0, 00:10:49.588 "rw_mbytes_per_sec": 0, 00:10:49.588 "r_mbytes_per_sec": 0, 00:10:49.588 "w_mbytes_per_sec": 0 00:10:49.588 }, 00:10:49.588 "claimed": false, 00:10:49.588 "zoned": false, 00:10:49.588 
"supported_io_types": { 00:10:49.588 "read": true, 00:10:49.588 "write": true, 00:10:49.588 "unmap": false, 00:10:49.588 "write_zeroes": true, 00:10:49.588 "flush": false, 00:10:49.588 "reset": true, 00:10:49.588 "compare": false, 00:10:49.588 "compare_and_write": false, 00:10:49.588 "abort": false, 00:10:49.588 "nvme_admin": false, 00:10:49.588 "nvme_io": false 00:10:49.588 }, 00:10:49.588 "memory_domains": [ 00:10:49.588 { 00:10:49.588 "dma_device_id": "system", 00:10:49.588 "dma_device_type": 1 00:10:49.588 }, 00:10:49.588 { 00:10:49.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.588 "dma_device_type": 2 00:10:49.588 }, 00:10:49.588 { 00:10:49.588 "dma_device_id": "system", 00:10:49.588 "dma_device_type": 1 00:10:49.588 }, 00:10:49.588 { 00:10:49.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.588 "dma_device_type": 2 00:10:49.588 } 00:10:49.588 ], 00:10:49.588 "driver_specific": { 00:10:49.588 "raid": { 00:10:49.588 "uuid": "0d53e7bf-65d4-444a-880c-df2f75ca75f2", 00:10:49.588 "strip_size_kb": 0, 00:10:49.588 "state": "online", 00:10:49.588 "raid_level": "raid1", 00:10:49.588 "superblock": true, 00:10:49.588 "num_base_bdevs": 2, 00:10:49.588 "num_base_bdevs_discovered": 2, 00:10:49.588 "num_base_bdevs_operational": 2, 00:10:49.588 "base_bdevs_list": [ 00:10:49.588 { 00:10:49.588 "name": "BaseBdev1", 00:10:49.588 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:49.588 "is_configured": true, 00:10:49.588 "data_offset": 2048, 00:10:49.588 "data_size": 63488 00:10:49.588 }, 00:10:49.588 { 00:10:49.588 "name": "BaseBdev2", 00:10:49.588 "uuid": "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb", 00:10:49.588 "is_configured": true, 00:10:49.588 "data_offset": 2048, 00:10:49.588 "data_size": 63488 00:10:49.588 } 00:10:49.588 ] 00:10:49.588 } 00:10:49.588 } 00:10:49.588 }' 00:10:49.588 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:49.588 
13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:49.588 BaseBdev2' 00:10:49.588 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:49.588 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:49.588 13:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:49.589 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:49.589 "name": "BaseBdev1", 00:10:49.589 "aliases": [ 00:10:49.589 "6a154431-bd29-4fb9-bad3-5601be956e2e" 00:10:49.589 ], 00:10:49.589 "product_name": "Malloc disk", 00:10:49.589 "block_size": 512, 00:10:49.589 "num_blocks": 65536, 00:10:49.589 "uuid": "6a154431-bd29-4fb9-bad3-5601be956e2e", 00:10:49.589 "assigned_rate_limits": { 00:10:49.589 "rw_ios_per_sec": 0, 00:10:49.589 "rw_mbytes_per_sec": 0, 00:10:49.589 "r_mbytes_per_sec": 0, 00:10:49.589 "w_mbytes_per_sec": 0 00:10:49.589 }, 00:10:49.589 "claimed": true, 00:10:49.589 "claim_type": "exclusive_write", 00:10:49.589 "zoned": false, 00:10:49.589 "supported_io_types": { 00:10:49.589 "read": true, 00:10:49.589 "write": true, 00:10:49.589 "unmap": true, 00:10:49.589 "write_zeroes": true, 00:10:49.589 "flush": true, 00:10:49.589 "reset": true, 00:10:49.589 "compare": false, 00:10:49.589 "compare_and_write": false, 00:10:49.589 "abort": true, 00:10:49.589 "nvme_admin": false, 00:10:49.589 "nvme_io": false 00:10:49.589 }, 00:10:49.589 "memory_domains": [ 00:10:49.589 { 00:10:49.589 "dma_device_id": "system", 00:10:49.589 "dma_device_type": 1 00:10:49.589 }, 00:10:49.589 { 00:10:49.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.589 "dma_device_type": 2 00:10:49.589 } 00:10:49.589 ], 00:10:49.589 "driver_specific": {} 00:10:49.589 }' 00:10:49.849 13:39:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:49.850 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.111 "name": "BaseBdev2", 00:10:50.111 "aliases": [ 00:10:50.111 "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb" 00:10:50.111 ], 00:10:50.111 "product_name": "Malloc disk", 00:10:50.111 "block_size": 512, 00:10:50.111 
"num_blocks": 65536, 00:10:50.111 "uuid": "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb", 00:10:50.111 "assigned_rate_limits": { 00:10:50.111 "rw_ios_per_sec": 0, 00:10:50.111 "rw_mbytes_per_sec": 0, 00:10:50.111 "r_mbytes_per_sec": 0, 00:10:50.111 "w_mbytes_per_sec": 0 00:10:50.111 }, 00:10:50.111 "claimed": true, 00:10:50.111 "claim_type": "exclusive_write", 00:10:50.111 "zoned": false, 00:10:50.111 "supported_io_types": { 00:10:50.111 "read": true, 00:10:50.111 "write": true, 00:10:50.111 "unmap": true, 00:10:50.111 "write_zeroes": true, 00:10:50.111 "flush": true, 00:10:50.111 "reset": true, 00:10:50.111 "compare": false, 00:10:50.111 "compare_and_write": false, 00:10:50.111 "abort": true, 00:10:50.111 "nvme_admin": false, 00:10:50.111 "nvme_io": false 00:10:50.111 }, 00:10:50.111 "memory_domains": [ 00:10:50.111 { 00:10:50.111 "dma_device_id": "system", 00:10:50.111 "dma_device_type": 1 00:10:50.111 }, 00:10:50.111 { 00:10:50.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.111 "dma_device_type": 2 00:10:50.111 } 00:10:50.111 ], 00:10:50.111 "driver_specific": {} 00:10:50.111 }' 00:10:50.111 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.371 13:39:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.371 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.632 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.632 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.632 13:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:50.893 [2024-06-10 13:39:05.122103] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:50.893 
13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:50.893 "name": "Existed_Raid", 00:10:50.893 "uuid": "0d53e7bf-65d4-444a-880c-df2f75ca75f2", 00:10:50.893 "strip_size_kb": 0, 00:10:50.893 "state": "online", 00:10:50.893 "raid_level": "raid1", 00:10:50.893 "superblock": true, 00:10:50.893 "num_base_bdevs": 2, 00:10:50.893 "num_base_bdevs_discovered": 1, 00:10:50.893 "num_base_bdevs_operational": 1, 00:10:50.893 "base_bdevs_list": [ 00:10:50.893 { 00:10:50.893 "name": null, 00:10:50.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.893 "is_configured": false, 00:10:50.893 "data_offset": 2048, 00:10:50.893 "data_size": 63488 00:10:50.893 }, 00:10:50.893 { 00:10:50.893 "name": "BaseBdev2", 00:10:50.893 "uuid": "2e0b6eca-3439-44bd-9dc4-93da9dcf95cb", 00:10:50.893 "is_configured": true, 00:10:50.893 "data_offset": 2048, 00:10:50.893 "data_size": 63488 00:10:50.893 } 00:10:50.893 ] 00:10:50.893 }' 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.893 13:39:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:51.465 13:39:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:51.465 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:51.465 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.465 13:39:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:51.725 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:51.725 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:51.725 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:51.986 [2024-06-10 13:39:06.297096] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:51.986 [2024-06-10 13:39:06.297160] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:51.986 [2024-06-10 13:39:06.303345] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:51.986 [2024-06-10 13:39:06.303370] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:51.986 [2024-06-10 13:39:06.303376] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x194fe00 name Existed_Raid, state offline 00:10:51.986 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:51.986 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:51.986 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.986 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1505339 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1505339 ']' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1505339 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1505339 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1505339' 00:10:52.247 killing process with pid 1505339 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1505339 00:10:52.247 [2024-06-10 13:39:06.574980] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1505339 00:10:52.247 [2024-06-10 13:39:06.575595] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:52.247 00:10:52.247 real 0m8.411s 00:10:52.247 user 0m15.692s 00:10:52.247 sys 0m1.316s 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:52.247 13:39:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:52.247 ************************************ 00:10:52.247 END TEST raid_state_function_test_sb 00:10:52.247 ************************************ 00:10:52.508 13:39:06 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:10:52.508 13:39:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:10:52.508 13:39:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:52.508 13:39:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:52.508 ************************************ 00:10:52.508 START TEST raid_superblock_test 00:10:52.508 ************************************ 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:52.508 
13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1507327 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1507327 /var/tmp/spdk-raid.sock 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1507327 ']' 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:52.508 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:52.508 13:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.508 [2024-06-10 13:39:06.834717] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:10:52.508 [2024-06-10 13:39:06.834769] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1507327 ] 00:10:52.508 [2024-06-10 13:39:06.926947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.769 [2024-06-10 13:39:06.996920] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:10:52.769 [2024-06-10 13:39:07.039195] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:52.769 [2024-06-10 13:39:07.039219] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:53.340 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:53.600 malloc1 00:10:53.600 13:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:53.860 [2024-06-10 13:39:08.082548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:53.860 [2024-06-10 13:39:08.082583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:53.860 [2024-06-10 13:39:08.082597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x212a550 00:10:53.860 [2024-06-10 13:39:08.082604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:53.860 [2024-06-10 13:39:08.083939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:53.860 [2024-06-10 13:39:08.083959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:53.860 pt1 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:53.860 malloc2 00:10:53.860 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:54.120 [2024-06-10 13:39:08.485691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:54.120 [2024-06-10 13:39:08.485721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.120 [2024-06-10 13:39:08.485730] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21ec0f0 00:10:54.120 [2024-06-10 13:39:08.485737] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.121 [2024-06-10 13:39:08.486988] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.121 [2024-06-10 13:39:08.487006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:54.121 pt2 00:10:54.121 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:54.121 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:54.121 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:54.382 [2024-06-10 13:39:08.674176] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:54.382 [2024-06-10 13:39:08.675229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:54.382 [2024-06-10 13:39:08.675345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21f6690 00:10:54.382 [2024-06-10 13:39:08.675353] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:54.382 [2024-06-10 13:39:08.675515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21220c0 00:10:54.382 [2024-06-10 13:39:08.675629] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21f6690 00:10:54.382 [2024-06-10 13:39:08.675638] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21f6690 00:10:54.382 [2024-06-10 13:39:08.675714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.382 13:39:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.382 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:54.643 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.643 "name": "raid_bdev1", 00:10:54.643 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:10:54.643 "strip_size_kb": 0, 00:10:54.643 "state": "online", 00:10:54.643 "raid_level": "raid1", 00:10:54.643 "superblock": true, 00:10:54.643 "num_base_bdevs": 2, 00:10:54.643 "num_base_bdevs_discovered": 2, 00:10:54.643 "num_base_bdevs_operational": 2, 00:10:54.643 "base_bdevs_list": [ 00:10:54.643 { 00:10:54.643 "name": "pt1", 00:10:54.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:54.643 "is_configured": true, 00:10:54.643 "data_offset": 2048, 00:10:54.643 "data_size": 63488 00:10:54.643 }, 00:10:54.643 { 00:10:54.643 "name": "pt2", 00:10:54.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:54.643 "is_configured": true, 00:10:54.643 "data_offset": 2048, 00:10:54.643 "data_size": 63488 00:10:54.643 } 00:10:54.643 ] 00:10:54.643 }' 00:10:54.643 13:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.643 13:39:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:55.214 [2024-06-10 13:39:09.632780] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:55.214 "name": "raid_bdev1", 00:10:55.214 "aliases": [ 00:10:55.214 "f5a6f65d-aca3-4be4-a408-ba90a757855d" 00:10:55.214 ], 00:10:55.214 "product_name": "Raid Volume", 00:10:55.214 "block_size": 512, 00:10:55.214 "num_blocks": 63488, 00:10:55.214 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:10:55.214 "assigned_rate_limits": { 00:10:55.214 "rw_ios_per_sec": 0, 00:10:55.214 "rw_mbytes_per_sec": 0, 00:10:55.214 "r_mbytes_per_sec": 0, 00:10:55.214 "w_mbytes_per_sec": 0 00:10:55.214 }, 00:10:55.214 "claimed": false, 00:10:55.214 "zoned": false, 00:10:55.214 "supported_io_types": { 00:10:55.214 "read": true, 00:10:55.214 "write": true, 00:10:55.214 "unmap": false, 00:10:55.214 "write_zeroes": true, 00:10:55.214 "flush": false, 00:10:55.214 "reset": true, 00:10:55.214 "compare": false, 00:10:55.214 "compare_and_write": false, 00:10:55.214 "abort": false, 00:10:55.214 "nvme_admin": false, 00:10:55.214 "nvme_io": false 00:10:55.214 }, 00:10:55.214 "memory_domains": [ 00:10:55.214 { 00:10:55.214 "dma_device_id": "system", 00:10:55.214 "dma_device_type": 1 00:10:55.214 }, 00:10:55.214 { 00:10:55.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.214 "dma_device_type": 2 00:10:55.214 }, 00:10:55.214 { 00:10:55.214 "dma_device_id": "system", 00:10:55.214 
"dma_device_type": 1 00:10:55.214 }, 00:10:55.214 { 00:10:55.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.214 "dma_device_type": 2 00:10:55.214 } 00:10:55.214 ], 00:10:55.214 "driver_specific": { 00:10:55.214 "raid": { 00:10:55.214 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:10:55.214 "strip_size_kb": 0, 00:10:55.214 "state": "online", 00:10:55.214 "raid_level": "raid1", 00:10:55.214 "superblock": true, 00:10:55.214 "num_base_bdevs": 2, 00:10:55.214 "num_base_bdevs_discovered": 2, 00:10:55.214 "num_base_bdevs_operational": 2, 00:10:55.214 "base_bdevs_list": [ 00:10:55.214 { 00:10:55.214 "name": "pt1", 00:10:55.214 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:55.214 "is_configured": true, 00:10:55.214 "data_offset": 2048, 00:10:55.214 "data_size": 63488 00:10:55.214 }, 00:10:55.214 { 00:10:55.214 "name": "pt2", 00:10:55.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:55.214 "is_configured": true, 00:10:55.214 "data_offset": 2048, 00:10:55.214 "data_size": 63488 00:10:55.214 } 00:10:55.214 ] 00:10:55.214 } 00:10:55.214 } 00:10:55.214 }' 00:10:55.214 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:55.475 pt2' 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.475 "name": "pt1", 00:10:55.475 "aliases": [ 00:10:55.475 "00000000-0000-0000-0000-000000000001" 00:10:55.475 ], 
00:10:55.475 "product_name": "passthru", 00:10:55.475 "block_size": 512, 00:10:55.475 "num_blocks": 65536, 00:10:55.475 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:55.475 "assigned_rate_limits": { 00:10:55.475 "rw_ios_per_sec": 0, 00:10:55.475 "rw_mbytes_per_sec": 0, 00:10:55.475 "r_mbytes_per_sec": 0, 00:10:55.475 "w_mbytes_per_sec": 0 00:10:55.475 }, 00:10:55.475 "claimed": true, 00:10:55.475 "claim_type": "exclusive_write", 00:10:55.475 "zoned": false, 00:10:55.475 "supported_io_types": { 00:10:55.475 "read": true, 00:10:55.475 "write": true, 00:10:55.475 "unmap": true, 00:10:55.475 "write_zeroes": true, 00:10:55.475 "flush": true, 00:10:55.475 "reset": true, 00:10:55.475 "compare": false, 00:10:55.475 "compare_and_write": false, 00:10:55.475 "abort": true, 00:10:55.475 "nvme_admin": false, 00:10:55.475 "nvme_io": false 00:10:55.475 }, 00:10:55.475 "memory_domains": [ 00:10:55.475 { 00:10:55.475 "dma_device_id": "system", 00:10:55.475 "dma_device_type": 1 00:10:55.475 }, 00:10:55.475 { 00:10:55.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.475 "dma_device_type": 2 00:10:55.475 } 00:10:55.475 ], 00:10:55.475 "driver_specific": { 00:10:55.475 "passthru": { 00:10:55.475 "name": "pt1", 00:10:55.475 "base_bdev_name": "malloc1" 00:10:55.475 } 00:10:55.475 } 00:10:55.475 }' 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.475 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:55.736 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:55.736 13:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.736 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:55.997 "name": "pt2", 00:10:55.997 "aliases": [ 00:10:55.997 "00000000-0000-0000-0000-000000000002" 00:10:55.997 ], 00:10:55.997 "product_name": "passthru", 00:10:55.997 "block_size": 512, 00:10:55.997 "num_blocks": 65536, 00:10:55.997 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:55.997 "assigned_rate_limits": { 00:10:55.997 "rw_ios_per_sec": 0, 00:10:55.997 "rw_mbytes_per_sec": 0, 00:10:55.997 "r_mbytes_per_sec": 0, 00:10:55.997 "w_mbytes_per_sec": 0 00:10:55.997 }, 00:10:55.997 "claimed": true, 00:10:55.997 "claim_type": "exclusive_write", 00:10:55.997 "zoned": false, 00:10:55.997 "supported_io_types": { 00:10:55.997 "read": true, 00:10:55.997 "write": true, 00:10:55.997 "unmap": true, 00:10:55.997 "write_zeroes": true, 00:10:55.997 "flush": true, 00:10:55.997 "reset": true, 00:10:55.997 "compare": false, 00:10:55.997 "compare_and_write": false, 00:10:55.997 "abort": true, 00:10:55.997 "nvme_admin": false, 00:10:55.997 "nvme_io": false 00:10:55.997 }, 00:10:55.997 
"memory_domains": [ 00:10:55.997 { 00:10:55.997 "dma_device_id": "system", 00:10:55.997 "dma_device_type": 1 00:10:55.997 }, 00:10:55.997 { 00:10:55.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.997 "dma_device_type": 2 00:10:55.997 } 00:10:55.997 ], 00:10:55.997 "driver_specific": { 00:10:55.997 "passthru": { 00:10:55.997 "name": "pt2", 00:10:55.997 "base_bdev_name": "malloc2" 00:10:55.997 } 00:10:55.997 } 00:10:55.997 }' 00:10:55.997 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:56.257 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.517 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:56.517 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:56.517 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:56.517 13:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:56.778 [2024-06-10 13:39:10.996250] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.778 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f5a6f65d-aca3-4be4-a408-ba90a757855d 00:10:56.778 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z f5a6f65d-aca3-4be4-a408-ba90a757855d ']' 00:10:56.778 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:56.778 [2024-06-10 13:39:11.200584] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:56.778 [2024-06-10 13:39:11.200595] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:56.778 [2024-06-10 13:39:11.200635] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:56.778 [2024-06-10 13:39:11.200676] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:56.778 [2024-06-10 13:39:11.200682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21f6690 name raid_bdev1, state offline 00:10:56.778 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.778 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:57.038 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:57.038 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:57.038 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:57.038 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:57.298 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:57.298 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:57.579 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:57.580 13:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:57.580 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:57.953 [2024-06-10 13:39:12.203087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:57.953 [2024-06-10 13:39:12.204221] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:57.953 [2024-06-10 13:39:12.204264] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:57.953 [2024-06-10 13:39:12.204291] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:57.953 [2024-06-10 13:39:12.204302] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:57.953 [2024-06-10 13:39:12.204307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21edec0 name raid_bdev1, state configuring 00:10:57.953 request: 00:10:57.953 { 00:10:57.953 "name": "raid_bdev1", 00:10:57.953 "raid_level": "raid1", 00:10:57.953 "base_bdevs": [ 00:10:57.953 "malloc1", 00:10:57.953 "malloc2" 00:10:57.953 ], 00:10:57.953 "superblock": false, 00:10:57.953 "method": "bdev_raid_create", 00:10:57.953 "req_id": 1 00:10:57.953 } 00:10:57.953 Got JSON-RPC error response 
00:10:57.953 response: 00:10:57.953 { 00:10:57.953 "code": -17, 00:10:57.953 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:57.953 } 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.953 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:58.213 [2024-06-10 13:39:12.596043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:58.213 [2024-06-10 13:39:12.596063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.213 [2024-06-10 13:39:12.596073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x212d7c0 00:10:58.213 [2024-06-10 13:39:12.596079] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.213 [2024-06-10 13:39:12.597403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.213 [2024-06-10 13:39:12.597422] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt1 00:10:58.213 [2024-06-10 13:39:12.597469] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:58.213 [2024-06-10 13:39:12.597487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:58.213 pt1 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:58.213 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.473 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.473 "name": "raid_bdev1", 00:10:58.473 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:10:58.473 "strip_size_kb": 0, 00:10:58.473 "state": "configuring", 
00:10:58.473 "raid_level": "raid1", 00:10:58.473 "superblock": true, 00:10:58.473 "num_base_bdevs": 2, 00:10:58.473 "num_base_bdevs_discovered": 1, 00:10:58.473 "num_base_bdevs_operational": 2, 00:10:58.473 "base_bdevs_list": [ 00:10:58.473 { 00:10:58.473 "name": "pt1", 00:10:58.473 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:58.473 "is_configured": true, 00:10:58.473 "data_offset": 2048, 00:10:58.473 "data_size": 63488 00:10:58.473 }, 00:10:58.473 { 00:10:58.473 "name": null, 00:10:58.473 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:58.473 "is_configured": false, 00:10:58.473 "data_offset": 2048, 00:10:58.473 "data_size": 63488 00:10:58.473 } 00:10:58.473 ] 00:10:58.473 }' 00:10:58.473 13:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.473 13:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.045 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:59.045 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:59.045 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:59.045 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:59.305 [2024-06-10 13:39:13.566514] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:59.305 [2024-06-10 13:39:13.566540] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.305 [2024-06-10 13:39:13.566550] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21eda40 00:10:59.305 [2024-06-10 13:39:13.566557] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.305 [2024-06-10 13:39:13.566824] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.305 [2024-06-10 13:39:13.566835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:59.305 [2024-06-10 13:39:13.566875] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:59.305 [2024-06-10 13:39:13.566887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:59.305 [2024-06-10 13:39:13.566962] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x212cf30 00:10:59.305 [2024-06-10 13:39:13.566969] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:59.305 [2024-06-10 13:39:13.567113] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212bb00 00:10:59.305 [2024-06-10 13:39:13.567231] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x212cf30 00:10:59.305 [2024-06-10 13:39:13.567238] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x212cf30 00:10:59.305 [2024-06-10 13:39:13.567316] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:59.305 pt2 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:59.305 13:39:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.305 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:59.565 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.565 "name": "raid_bdev1", 00:10:59.565 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:10:59.565 "strip_size_kb": 0, 00:10:59.565 "state": "online", 00:10:59.565 "raid_level": "raid1", 00:10:59.565 "superblock": true, 00:10:59.565 "num_base_bdevs": 2, 00:10:59.565 "num_base_bdevs_discovered": 2, 00:10:59.565 "num_base_bdevs_operational": 2, 00:10:59.565 "base_bdevs_list": [ 00:10:59.565 { 00:10:59.565 "name": "pt1", 00:10:59.565 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:59.565 "is_configured": true, 00:10:59.565 "data_offset": 2048, 00:10:59.565 "data_size": 63488 00:10:59.565 }, 00:10:59.565 { 00:10:59.565 "name": "pt2", 00:10:59.565 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:59.565 "is_configured": true, 00:10:59.565 "data_offset": 2048, 00:10:59.565 "data_size": 63488 00:10:59.565 } 00:10:59.565 ] 00:10:59.565 }' 00:10:59.565 13:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.565 13:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:00.137 [2024-06-10 13:39:14.529142] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:00.137 "name": "raid_bdev1", 00:11:00.137 "aliases": [ 00:11:00.137 "f5a6f65d-aca3-4be4-a408-ba90a757855d" 00:11:00.137 ], 00:11:00.137 "product_name": "Raid Volume", 00:11:00.137 "block_size": 512, 00:11:00.137 "num_blocks": 63488, 00:11:00.137 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:11:00.137 "assigned_rate_limits": { 00:11:00.137 "rw_ios_per_sec": 0, 00:11:00.137 "rw_mbytes_per_sec": 0, 00:11:00.137 "r_mbytes_per_sec": 0, 00:11:00.137 "w_mbytes_per_sec": 0 00:11:00.137 }, 00:11:00.137 "claimed": false, 00:11:00.137 "zoned": false, 00:11:00.137 "supported_io_types": { 00:11:00.137 "read": true, 00:11:00.137 "write": true, 00:11:00.137 "unmap": false, 00:11:00.137 "write_zeroes": true, 00:11:00.137 "flush": false, 00:11:00.137 "reset": true, 00:11:00.137 "compare": false, 00:11:00.137 "compare_and_write": 
false, 00:11:00.137 "abort": false, 00:11:00.137 "nvme_admin": false, 00:11:00.137 "nvme_io": false 00:11:00.137 }, 00:11:00.137 "memory_domains": [ 00:11:00.137 { 00:11:00.137 "dma_device_id": "system", 00:11:00.137 "dma_device_type": 1 00:11:00.137 }, 00:11:00.137 { 00:11:00.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.137 "dma_device_type": 2 00:11:00.137 }, 00:11:00.137 { 00:11:00.137 "dma_device_id": "system", 00:11:00.137 "dma_device_type": 1 00:11:00.137 }, 00:11:00.137 { 00:11:00.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.137 "dma_device_type": 2 00:11:00.137 } 00:11:00.137 ], 00:11:00.137 "driver_specific": { 00:11:00.137 "raid": { 00:11:00.137 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:11:00.137 "strip_size_kb": 0, 00:11:00.137 "state": "online", 00:11:00.137 "raid_level": "raid1", 00:11:00.137 "superblock": true, 00:11:00.137 "num_base_bdevs": 2, 00:11:00.137 "num_base_bdevs_discovered": 2, 00:11:00.137 "num_base_bdevs_operational": 2, 00:11:00.137 "base_bdevs_list": [ 00:11:00.137 { 00:11:00.137 "name": "pt1", 00:11:00.137 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:00.137 "is_configured": true, 00:11:00.137 "data_offset": 2048, 00:11:00.137 "data_size": 63488 00:11:00.137 }, 00:11:00.137 { 00:11:00.137 "name": "pt2", 00:11:00.137 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:00.137 "is_configured": true, 00:11:00.137 "data_offset": 2048, 00:11:00.137 "data_size": 63488 00:11:00.137 } 00:11:00.137 ] 00:11:00.137 } 00:11:00.137 } 00:11:00.137 }' 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:00.137 pt2' 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:00.137 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:00.399 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:00.399 "name": "pt1", 00:11:00.399 "aliases": [ 00:11:00.399 "00000000-0000-0000-0000-000000000001" 00:11:00.399 ], 00:11:00.399 "product_name": "passthru", 00:11:00.399 "block_size": 512, 00:11:00.399 "num_blocks": 65536, 00:11:00.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:00.399 "assigned_rate_limits": { 00:11:00.399 "rw_ios_per_sec": 0, 00:11:00.399 "rw_mbytes_per_sec": 0, 00:11:00.399 "r_mbytes_per_sec": 0, 00:11:00.399 "w_mbytes_per_sec": 0 00:11:00.399 }, 00:11:00.399 "claimed": true, 00:11:00.399 "claim_type": "exclusive_write", 00:11:00.399 "zoned": false, 00:11:00.399 "supported_io_types": { 00:11:00.399 "read": true, 00:11:00.399 "write": true, 00:11:00.399 "unmap": true, 00:11:00.399 "write_zeroes": true, 00:11:00.399 "flush": true, 00:11:00.399 "reset": true, 00:11:00.399 "compare": false, 00:11:00.399 "compare_and_write": false, 00:11:00.399 "abort": true, 00:11:00.399 "nvme_admin": false, 00:11:00.399 "nvme_io": false 00:11:00.399 }, 00:11:00.399 "memory_domains": [ 00:11:00.399 { 00:11:00.399 "dma_device_id": "system", 00:11:00.399 "dma_device_type": 1 00:11:00.399 }, 00:11:00.399 { 00:11:00.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.399 "dma_device_type": 2 00:11:00.399 } 00:11:00.399 ], 00:11:00.399 "driver_specific": { 00:11:00.399 "passthru": { 00:11:00.399 "name": "pt1", 00:11:00.399 "base_bdev_name": "malloc1" 00:11:00.399 } 00:11:00.399 } 00:11:00.399 }' 00:11:00.399 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.399 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.399 13:39:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:00.399 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.661 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:00.661 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:00.661 13:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:00.661 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:00.921 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:00.921 "name": "pt2", 00:11:00.921 "aliases": [ 00:11:00.921 "00000000-0000-0000-0000-000000000002" 00:11:00.921 ], 00:11:00.921 "product_name": "passthru", 00:11:00.921 "block_size": 512, 00:11:00.921 "num_blocks": 65536, 00:11:00.921 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:00.921 "assigned_rate_limits": { 00:11:00.921 "rw_ios_per_sec": 0, 00:11:00.921 "rw_mbytes_per_sec": 0, 00:11:00.921 "r_mbytes_per_sec": 0, 00:11:00.921 "w_mbytes_per_sec": 0 00:11:00.921 }, 00:11:00.921 "claimed": true, 00:11:00.921 
"claim_type": "exclusive_write", 00:11:00.921 "zoned": false, 00:11:00.921 "supported_io_types": { 00:11:00.921 "read": true, 00:11:00.921 "write": true, 00:11:00.921 "unmap": true, 00:11:00.921 "write_zeroes": true, 00:11:00.921 "flush": true, 00:11:00.921 "reset": true, 00:11:00.921 "compare": false, 00:11:00.921 "compare_and_write": false, 00:11:00.921 "abort": true, 00:11:00.921 "nvme_admin": false, 00:11:00.921 "nvme_io": false 00:11:00.921 }, 00:11:00.921 "memory_domains": [ 00:11:00.921 { 00:11:00.921 "dma_device_id": "system", 00:11:00.921 "dma_device_type": 1 00:11:00.921 }, 00:11:00.921 { 00:11:00.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.921 "dma_device_type": 2 00:11:00.921 } 00:11:00.921 ], 00:11:00.921 "driver_specific": { 00:11:00.921 "passthru": { 00:11:00.921 "name": "pt2", 00:11:00.921 "base_bdev_name": "malloc2" 00:11:00.921 } 00:11:00.921 } 00:11:00.921 }' 00:11:00.921 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.921 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:00.921 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:00.921 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:01.181 
13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:01.181 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:01.441 [2024-06-10 13:39:15.832455] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:01.441 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' f5a6f65d-aca3-4be4-a408-ba90a757855d '!=' f5a6f65d-aca3-4be4-a408-ba90a757855d ']' 00:11:01.441 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:11:01.441 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:01.441 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:01.441 13:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:01.701 [2024-06-10 13:39:16.036820] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:01.701 13:39:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:01.701 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.961 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.961 "name": "raid_bdev1", 00:11:01.961 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:11:01.961 "strip_size_kb": 0, 00:11:01.961 "state": "online", 00:11:01.961 "raid_level": "raid1", 00:11:01.961 "superblock": true, 00:11:01.961 "num_base_bdevs": 2, 00:11:01.961 "num_base_bdevs_discovered": 1, 00:11:01.961 "num_base_bdevs_operational": 1, 00:11:01.961 "base_bdevs_list": [ 00:11:01.961 { 00:11:01.961 "name": null, 00:11:01.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.961 "is_configured": false, 00:11:01.961 "data_offset": 2048, 00:11:01.961 "data_size": 63488 00:11:01.961 }, 00:11:01.961 { 00:11:01.961 "name": "pt2", 00:11:01.961 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:01.961 "is_configured": true, 00:11:01.961 "data_offset": 2048, 00:11:01.961 "data_size": 63488 00:11:01.961 } 00:11:01.961 ] 00:11:01.961 }' 00:11:01.961 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.961 13:39:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.530 13:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:02.790 [2024-06-10 13:39:17.011283] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:02.790 [2024-06-10 13:39:17.011297] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:02.790 [2024-06-10 13:39:17.011328] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:02.790 [2024-06-10 13:39:17.011355] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:02.790 [2024-06-10 13:39:17.011361] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x212cf30 name raid_bdev1, state offline 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:02.790 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:03.050 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:11:03.050 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:03.050 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:11:03.050 13:39:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:11:03.050 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:11:03.050 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:03.310 [2024-06-10 13:39:17.616795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:03.310 [2024-06-10 13:39:17.616820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:03.310 [2024-06-10 13:39:17.616831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2123f20 00:11:03.310 [2024-06-10 13:39:17.616838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:03.310 [2024-06-10 13:39:17.618193] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:03.310 [2024-06-10 13:39:17.618213] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:03.310 [2024-06-10 13:39:17.618259] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:03.310 [2024-06-10 13:39:17.618277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:03.310 [2024-06-10 13:39:17.618338] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2122770 00:11:03.310 [2024-06-10 13:39:17.618344] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:03.310 [2024-06-10 13:39:17.618495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212a220 00:11:03.310 [2024-06-10 13:39:17.618594] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2122770 00:11:03.310 [2024-06-10 13:39:17.618599] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x2122770 00:11:03.310 [2024-06-10 13:39:17.618674] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.310 pt2 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.310 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:03.570 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.570 "name": "raid_bdev1", 00:11:03.570 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:11:03.570 "strip_size_kb": 0, 00:11:03.570 "state": "online", 00:11:03.570 "raid_level": "raid1", 00:11:03.570 "superblock": true, 00:11:03.570 "num_base_bdevs": 2, 00:11:03.570 
"num_base_bdevs_discovered": 1, 00:11:03.570 "num_base_bdevs_operational": 1, 00:11:03.570 "base_bdevs_list": [ 00:11:03.570 { 00:11:03.570 "name": null, 00:11:03.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.570 "is_configured": false, 00:11:03.570 "data_offset": 2048, 00:11:03.570 "data_size": 63488 00:11:03.570 }, 00:11:03.570 { 00:11:03.570 "name": "pt2", 00:11:03.570 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:03.570 "is_configured": true, 00:11:03.570 "data_offset": 2048, 00:11:03.570 "data_size": 63488 00:11:03.570 } 00:11:03.570 ] 00:11:03.570 }' 00:11:03.570 13:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.570 13:39:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.139 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:04.139 [2024-06-10 13:39:18.491000] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:04.140 [2024-06-10 13:39:18.491013] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:04.140 [2024-06-10 13:39:18.491044] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:04.140 [2024-06-10 13:39:18.491071] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:04.140 [2024-06-10 13:39:18.491077] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2122770 name raid_bdev1, state offline 00:11:04.140 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.140 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:11:04.400 13:39:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:11:04.400 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:11:04.400 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:11:04.400 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:04.660 [2024-06-10 13:39:18.892008] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:04.660 [2024-06-10 13:39:18.892033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:04.660 [2024-06-10 13:39:18.892042] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x212aa10 00:11:04.660 [2024-06-10 13:39:18.892049] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:04.660 [2024-06-10 13:39:18.893369] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:04.660 [2024-06-10 13:39:18.893387] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:04.660 [2024-06-10 13:39:18.893431] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:04.660 [2024-06-10 13:39:18.893446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:04.660 [2024-06-10 13:39:18.893519] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:11:04.660 [2024-06-10 13:39:18.893527] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:04.660 [2024-06-10 13:39:18.893534] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2120ea0 name raid_bdev1, state configuring 00:11:04.660 [2024-06-10 13:39:18.893549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
pt2 is claimed 00:11:04.660 [2024-06-10 13:39:18.893590] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2120ea0 00:11:04.660 [2024-06-10 13:39:18.893596] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:04.660 [2024-06-10 13:39:18.893747] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212bb00 00:11:04.660 [2024-06-10 13:39:18.893846] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2120ea0 00:11:04.660 [2024-06-10 13:39:18.893851] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2120ea0 00:11:04.660 [2024-06-10 13:39:18.893933] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.660 pt1 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.660 13:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:04.660 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.660 "name": "raid_bdev1", 00:11:04.660 "uuid": "f5a6f65d-aca3-4be4-a408-ba90a757855d", 00:11:04.660 "strip_size_kb": 0, 00:11:04.660 "state": "online", 00:11:04.660 "raid_level": "raid1", 00:11:04.660 "superblock": true, 00:11:04.660 "num_base_bdevs": 2, 00:11:04.660 "num_base_bdevs_discovered": 1, 00:11:04.660 "num_base_bdevs_operational": 1, 00:11:04.660 "base_bdevs_list": [ 00:11:04.660 { 00:11:04.660 "name": null, 00:11:04.660 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.660 "is_configured": false, 00:11:04.660 "data_offset": 2048, 00:11:04.660 "data_size": 63488 00:11:04.660 }, 00:11:04.660 { 00:11:04.660 "name": "pt2", 00:11:04.660 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:04.660 "is_configured": true, 00:11:04.660 "data_offset": 2048, 00:11:04.660 "data_size": 63488 00:11:04.660 } 00:11:04.660 ] 00:11:04.660 }' 00:11:04.660 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.660 13:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.231 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:05.231 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:11:05.491 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:11:05.491 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:05.491 13:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:11:05.752 [2024-06-10 13:39:20.035073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' f5a6f65d-aca3-4be4-a408-ba90a757855d '!=' f5a6f65d-aca3-4be4-a408-ba90a757855d ']' 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1507327 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1507327 ']' 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1507327 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1507327 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1507327' 00:11:05.752 killing process with pid 1507327 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1507327 00:11:05.752 [2024-06-10 13:39:20.102789] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:05.752 [2024-06-10 13:39:20.102832] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:05.752 [2024-06-10 13:39:20.102866] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:05.752 [2024-06-10 13:39:20.102872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2120ea0 name raid_bdev1, state offline 00:11:05.752 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1507327 00:11:05.752 [2024-06-10 13:39:20.112285] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:06.012 13:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:06.012 00:11:06.012 real 0m13.457s 00:11:06.012 user 0m24.929s 00:11:06.012 sys 0m2.032s 00:11:06.012 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:06.012 13:39:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.012 ************************************ 00:11:06.012 END TEST raid_superblock_test 00:11:06.012 ************************************ 00:11:06.012 13:39:20 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:11:06.012 13:39:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:06.012 13:39:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:06.012 13:39:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:06.012 ************************************ 00:11:06.012 START TEST raid_read_error_test 00:11:06.012 ************************************ 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 read 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i 
= 1 )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.KYQRY0QiZF 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1510738 00:11:06.012 
13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1510738 /var/tmp/spdk-raid.sock 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1510738 ']' 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:06.012 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:06.012 13:39:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.012 [2024-06-10 13:39:20.378360] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:11:06.012 [2024-06-10 13:39:20.378405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1510738 ] 00:11:06.012 [2024-06-10 13:39:20.465809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:06.272 [2024-06-10 13:39:20.531078] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:11:06.272 [2024-06-10 13:39:20.570188] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.272 [2024-06-10 13:39:20.570211] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.842 13:39:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:06.842 13:39:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:06.842 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:06.842 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:07.102 BaseBdev1_malloc 00:11:07.102 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:07.361 true 00:11:07.361 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:07.361 [2024-06-10 13:39:21.817885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:07.361 [2024-06-10 13:39:21.817916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:07.361 
[2024-06-10 13:39:21.817928] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd61c90 00:11:07.361 [2024-06-10 13:39:21.817935] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:07.361 [2024-06-10 13:39:21.819388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:07.361 [2024-06-10 13:39:21.819409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:07.361 BaseBdev1 00:11:07.361 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:07.361 13:39:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:07.621 BaseBdev2_malloc 00:11:07.621 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:07.879 true 00:11:07.879 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:08.139 [2024-06-10 13:39:22.397394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:08.139 [2024-06-10 13:39:22.397421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:08.139 [2024-06-10 13:39:22.397432] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd66400 00:11:08.139 [2024-06-10 13:39:22.397438] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:08.139 [2024-06-10 13:39:22.398680] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:08.139 [2024-06-10 13:39:22.398698] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:08.139 BaseBdev2 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:08.139 [2024-06-10 13:39:22.585888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:08.139 [2024-06-10 13:39:22.586938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:08.139 [2024-06-10 13:39:22.587086] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd65e20 00:11:08.139 [2024-06-10 13:39:22.587095] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:08.139 [2024-06-10 13:39:22.587253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb7450 00:11:08.139 [2024-06-10 13:39:22.587378] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd65e20 00:11:08.139 [2024-06-10 13:39:22.587384] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd65e20 00:11:08.139 [2024-06-10 13:39:22.587462] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.139 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:08.399 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:08.399 "name": "raid_bdev1", 00:11:08.399 "uuid": "89d104fd-72d1-43e2-b10e-8ee2416c82ae", 00:11:08.399 "strip_size_kb": 0, 00:11:08.399 "state": "online", 00:11:08.399 "raid_level": "raid1", 00:11:08.399 "superblock": true, 00:11:08.399 "num_base_bdevs": 2, 00:11:08.399 "num_base_bdevs_discovered": 2, 00:11:08.399 "num_base_bdevs_operational": 2, 00:11:08.399 "base_bdevs_list": [ 00:11:08.399 { 00:11:08.399 "name": "BaseBdev1", 00:11:08.399 "uuid": "a8e01a2c-aa30-5a6a-8098-d3881b2bf4ca", 00:11:08.399 "is_configured": true, 00:11:08.399 "data_offset": 2048, 00:11:08.399 "data_size": 63488 00:11:08.399 }, 00:11:08.399 { 00:11:08.399 "name": "BaseBdev2", 00:11:08.399 "uuid": "e875fb6b-6673-50ee-850d-7e9e5aa5e63d", 00:11:08.399 "is_configured": true, 00:11:08.399 "data_offset": 2048, 00:11:08.399 "data_size": 63488 00:11:08.399 } 00:11:08.399 ] 00:11:08.399 }' 00:11:08.399 13:39:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:08.399 13:39:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.969 13:39:23 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:11:08.969 13:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:09.229 [2024-06-10 13:39:23.464312] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd6a030 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.170 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:10.171 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.431 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.431 "name": "raid_bdev1", 00:11:10.431 "uuid": "89d104fd-72d1-43e2-b10e-8ee2416c82ae", 00:11:10.431 "strip_size_kb": 0, 00:11:10.431 "state": "online", 00:11:10.431 "raid_level": "raid1", 00:11:10.431 "superblock": true, 00:11:10.431 "num_base_bdevs": 2, 00:11:10.431 "num_base_bdevs_discovered": 2, 00:11:10.431 "num_base_bdevs_operational": 2, 00:11:10.431 "base_bdevs_list": [ 00:11:10.431 { 00:11:10.431 "name": "BaseBdev1", 00:11:10.431 "uuid": "a8e01a2c-aa30-5a6a-8098-d3881b2bf4ca", 00:11:10.431 "is_configured": true, 00:11:10.431 "data_offset": 2048, 00:11:10.431 "data_size": 63488 00:11:10.431 }, 00:11:10.431 { 00:11:10.431 "name": "BaseBdev2", 00:11:10.431 "uuid": "e875fb6b-6673-50ee-850d-7e9e5aa5e63d", 00:11:10.431 "is_configured": true, 00:11:10.431 "data_offset": 2048, 00:11:10.431 "data_size": 63488 00:11:10.431 } 00:11:10.431 ] 00:11:10.431 }' 00:11:10.431 13:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.431 13:39:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.001 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:11.261 [2024-06-10 13:39:25.541703] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:11.261 [2024-06-10 13:39:25.541732] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:11.261 [2024-06-10 13:39:25.544520] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:11.261 [2024-06-10 13:39:25.544545] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:11.261 [2024-06-10 13:39:25.544607] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:11.261 [2024-06-10 13:39:25.544620] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd65e20 name raid_bdev1, state offline 00:11:11.261 0 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1510738 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1510738 ']' 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1510738 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1510738 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1510738' 00:11:11.261 killing process with pid 1510738 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1510738 00:11:11.261 [2024-06-10 13:39:25.610972] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:11.261 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 
-- # wait 1510738 00:11:11.261 [2024-06-10 13:39:25.616213] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.KYQRY0QiZF 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:11.522 00:11:11.522 real 0m5.441s 00:11:11.522 user 0m8.589s 00:11:11.522 sys 0m0.754s 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:11.522 13:39:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.522 ************************************ 00:11:11.522 END TEST raid_read_error_test 00:11:11.522 ************************************ 00:11:11.522 13:39:25 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:11:11.522 13:39:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:11.522 13:39:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:11.522 13:39:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:11.522 ************************************ 00:11:11.522 START TEST raid_write_error_test 00:11:11.522 ************************************ 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # 
raid_io_error_test raid1 2 write 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # 
'[' raid1 '!=' raid1 ']' 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ilRzs2Wynp 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1511834 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1511834 /var/tmp/spdk-raid.sock 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1511834 ']' 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:11.522 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:11.522 13:39:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.522 [2024-06-10 13:39:25.859399] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:11:11.522 [2024-06-10 13:39:25.859434] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1511834 ] 00:11:11.522 [2024-06-10 13:39:25.939182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:11.784 [2024-06-10 13:39:26.004035] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:11:11.784 [2024-06-10 13:39:26.050837] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.784 [2024-06-10 13:39:26.050864] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:11.784 13:39:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:11.784 13:39:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:11.784 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:11.784 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:11.784 BaseBdev1_malloc 00:11:11.784 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:12.044 true 00:11:12.044 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:12.044 [2024-06-10 13:39:26.492746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:12.044 [2024-06-10 13:39:26.492780] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:11:12.044 [2024-06-10 13:39:26.492791] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ccdc90 00:11:12.044 [2024-06-10 13:39:26.492798] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.045 [2024-06-10 13:39:26.494234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.045 [2024-06-10 13:39:26.494255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:12.045 BaseBdev1 00:11:12.045 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:12.045 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:12.306 BaseBdev2_malloc 00:11:12.306 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:12.566 true 00:11:12.566 13:39:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:12.566 [2024-06-10 13:39:27.024122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:12.566 [2024-06-10 13:39:27.024150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.566 [2024-06-10 13:39:27.024164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd2400 00:11:12.566 [2024-06-10 13:39:27.024172] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.566 [2024-06-10 13:39:27.025438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.566 [2024-06-10 13:39:27.025457] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:12.566 BaseBdev2 00:11:12.566 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:12.826 [2024-06-10 13:39:27.160489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:12.826 [2024-06-10 13:39:27.161554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:12.826 [2024-06-10 13:39:27.161702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd1e20 00:11:12.826 [2024-06-10 13:39:27.161711] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:12.826 [2024-06-10 13:39:27.161865] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b23450 00:11:12.826 [2024-06-10 13:39:27.161985] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd1e20 00:11:12.826 [2024-06-10 13:39:27.161991] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd1e20 00:11:12.826 [2024-06-10 13:39:27.162069] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.826 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.087 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.087 "name": "raid_bdev1", 00:11:13.087 "uuid": "4fcb5f6e-714e-437f-8af4-b33d22ec2949", 00:11:13.087 "strip_size_kb": 0, 00:11:13.087 "state": "online", 00:11:13.087 "raid_level": "raid1", 00:11:13.087 "superblock": true, 00:11:13.087 "num_base_bdevs": 2, 00:11:13.087 "num_base_bdevs_discovered": 2, 00:11:13.087 "num_base_bdevs_operational": 2, 00:11:13.087 "base_bdevs_list": [ 00:11:13.087 { 00:11:13.087 "name": "BaseBdev1", 00:11:13.087 "uuid": "65767d77-adba-5324-8ece-8e92678d8af9", 00:11:13.087 "is_configured": true, 00:11:13.087 "data_offset": 2048, 00:11:13.087 "data_size": 63488 00:11:13.087 }, 00:11:13.087 { 00:11:13.087 "name": "BaseBdev2", 00:11:13.087 "uuid": "e1efd921-c3e2-5174-ab6d-cdfbf45d2788", 00:11:13.087 "is_configured": true, 00:11:13.087 "data_offset": 2048, 00:11:13.087 "data_size": 63488 00:11:13.087 } 00:11:13.087 ] 00:11:13.087 }' 00:11:13.087 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.087 13:39:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.658 
13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:13.658 13:39:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:13.658 [2024-06-10 13:39:27.986798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd6030 00:11:14.599 13:39:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:14.599 [2024-06-10 13:39:29.055947] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:11:14.599 [2024-06-10 13:39:29.055986] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:14.599 [2024-06-10 13:39:29.056152] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cd6030 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:14.599 13:39:29 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.599 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:14.859 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:14.859 "name": "raid_bdev1", 00:11:14.859 "uuid": "4fcb5f6e-714e-437f-8af4-b33d22ec2949", 00:11:14.859 "strip_size_kb": 0, 00:11:14.859 "state": "online", 00:11:14.859 "raid_level": "raid1", 00:11:14.859 "superblock": true, 00:11:14.859 "num_base_bdevs": 2, 00:11:14.859 "num_base_bdevs_discovered": 1, 00:11:14.859 "num_base_bdevs_operational": 1, 00:11:14.859 "base_bdevs_list": [ 00:11:14.859 { 00:11:14.859 "name": null, 00:11:14.859 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:14.859 "is_configured": false, 00:11:14.859 "data_offset": 2048, 00:11:14.859 "data_size": 63488 00:11:14.859 }, 00:11:14.859 { 00:11:14.859 "name": "BaseBdev2", 00:11:14.859 "uuid": "e1efd921-c3e2-5174-ab6d-cdfbf45d2788", 00:11:14.859 "is_configured": true, 00:11:14.859 "data_offset": 2048, 00:11:14.859 "data_size": 63488 00:11:14.859 } 00:11:14.859 ] 00:11:14.859 }' 00:11:14.859 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:11:14.859 13:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.428 13:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:15.689 [2024-06-10 13:39:30.038986] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:15.689 [2024-06-10 13:39:30.039012] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:15.689 [2024-06-10 13:39:30.041778] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:15.689 [2024-06-10 13:39:30.041803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.689 [2024-06-10 13:39:30.041844] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:15.689 [2024-06-10 13:39:30.041850] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd1e20 name raid_bdev1, state offline 00:11:15.689 0 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1511834 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1511834 ']' 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1511834 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1511834 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo 
']' 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1511834' 00:11:15.689 killing process with pid 1511834 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1511834 00:11:15.689 [2024-06-10 13:39:30.084943] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:15.689 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1511834 00:11:15.689 [2024-06-10 13:39:30.090636] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ilRzs2Wynp 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:15.949 00:11:15.949 real 0m4.405s 00:11:15.949 user 0m7.078s 00:11:15.949 sys 0m0.631s 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:15.949 13:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.949 ************************************ 00:11:15.949 END TEST raid_write_error_test 00:11:15.949 ************************************ 00:11:15.949 13:39:30 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:15.949 13:39:30 bdev_raid -- 
bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:15.949 13:39:30 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:11:15.949 13:39:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:15.949 13:39:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:15.949 13:39:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:15.949 ************************************ 00:11:15.949 START TEST raid_state_function_test 00:11:15.949 ************************************ 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 false 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1512911 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1512911' 00:11:15.949 Process raid pid: 1512911 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1512911 /var/tmp/spdk-raid.sock 00:11:15.949 13:39:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1512911 ']' 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:15.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:15.949 13:39:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.949 [2024-06-10 13:39:30.366872] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:11:15.949 [2024-06-10 13:39:30.366918] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:16.210 [2024-06-10 13:39:30.454137] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:16.210 [2024-06-10 13:39:30.520409] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:11:16.210 [2024-06-10 13:39:30.563203] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.210 [2024-06-10 13:39:30.563225] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.780 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:16.781 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:11:16.781 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:17.040 [2024-06-10 13:39:31.407211] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:17.040 [2024-06-10 13:39:31.407241] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:17.040 [2024-06-10 13:39:31.407247] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:17.040 [2024-06-10 13:39:31.407253] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:17.040 [2024-06-10 13:39:31.407258] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:17.040 [2024-06-10 13:39:31.407264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:17.040 13:39:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.040 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.300 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.300 "name": "Existed_Raid", 00:11:17.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.300 "strip_size_kb": 64, 00:11:17.300 "state": "configuring", 00:11:17.300 "raid_level": "raid0", 00:11:17.300 "superblock": false, 00:11:17.300 "num_base_bdevs": 3, 00:11:17.300 "num_base_bdevs_discovered": 0, 00:11:17.300 "num_base_bdevs_operational": 3, 00:11:17.300 "base_bdevs_list": [ 00:11:17.300 { 
00:11:17.300 "name": "BaseBdev1", 00:11:17.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.300 "is_configured": false, 00:11:17.300 "data_offset": 0, 00:11:17.300 "data_size": 0 00:11:17.300 }, 00:11:17.300 { 00:11:17.300 "name": "BaseBdev2", 00:11:17.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.300 "is_configured": false, 00:11:17.300 "data_offset": 0, 00:11:17.300 "data_size": 0 00:11:17.300 }, 00:11:17.300 { 00:11:17.300 "name": "BaseBdev3", 00:11:17.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.301 "is_configured": false, 00:11:17.301 "data_offset": 0, 00:11:17.301 "data_size": 0 00:11:17.301 } 00:11:17.301 ] 00:11:17.301 }' 00:11:17.301 13:39:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.301 13:39:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.869 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:18.130 [2024-06-10 13:39:32.357496] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:18.130 [2024-06-10 13:39:32.357515] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1554740 name Existed_Raid, state configuring 00:11:18.130 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:18.130 [2024-06-10 13:39:32.550004] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:18.130 [2024-06-10 13:39:32.550020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:18.130 [2024-06-10 13:39:32.550025] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:18.130 [2024-06-10 13:39:32.550031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.130 [2024-06-10 13:39:32.550036] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:18.130 [2024-06-10 13:39:32.550041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:18.130 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:18.390 [2024-06-10 13:39:32.749387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.390 BaseBdev1 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:18.390 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:18.651 13:39:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:18.911 [ 00:11:18.911 { 00:11:18.911 "name": "BaseBdev1", 00:11:18.911 "aliases": [ 00:11:18.911 
"46a1cb72-e792-4228-9656-0cf8291cfb0a" 00:11:18.911 ], 00:11:18.911 "product_name": "Malloc disk", 00:11:18.911 "block_size": 512, 00:11:18.911 "num_blocks": 65536, 00:11:18.911 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:18.911 "assigned_rate_limits": { 00:11:18.911 "rw_ios_per_sec": 0, 00:11:18.911 "rw_mbytes_per_sec": 0, 00:11:18.911 "r_mbytes_per_sec": 0, 00:11:18.911 "w_mbytes_per_sec": 0 00:11:18.911 }, 00:11:18.911 "claimed": true, 00:11:18.911 "claim_type": "exclusive_write", 00:11:18.911 "zoned": false, 00:11:18.911 "supported_io_types": { 00:11:18.911 "read": true, 00:11:18.911 "write": true, 00:11:18.911 "unmap": true, 00:11:18.911 "write_zeroes": true, 00:11:18.911 "flush": true, 00:11:18.911 "reset": true, 00:11:18.911 "compare": false, 00:11:18.911 "compare_and_write": false, 00:11:18.911 "abort": true, 00:11:18.911 "nvme_admin": false, 00:11:18.911 "nvme_io": false 00:11:18.911 }, 00:11:18.911 "memory_domains": [ 00:11:18.911 { 00:11:18.911 "dma_device_id": "system", 00:11:18.911 "dma_device_type": 1 00:11:18.911 }, 00:11:18.911 { 00:11:18.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.911 "dma_device_type": 2 00:11:18.911 } 00:11:18.911 ], 00:11:18.911 "driver_specific": {} 00:11:18.911 } 00:11:18.911 ] 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.911 13:39:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.911 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.912 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.912 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.912 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.912 "name": "Existed_Raid", 00:11:18.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.912 "strip_size_kb": 64, 00:11:18.912 "state": "configuring", 00:11:18.912 "raid_level": "raid0", 00:11:18.912 "superblock": false, 00:11:18.912 "num_base_bdevs": 3, 00:11:18.912 "num_base_bdevs_discovered": 1, 00:11:18.912 "num_base_bdevs_operational": 3, 00:11:18.912 "base_bdevs_list": [ 00:11:18.912 { 00:11:18.912 "name": "BaseBdev1", 00:11:18.912 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:18.912 "is_configured": true, 00:11:18.912 "data_offset": 0, 00:11:18.912 "data_size": 65536 00:11:18.912 }, 00:11:18.912 { 00:11:18.912 "name": "BaseBdev2", 00:11:18.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.912 "is_configured": false, 00:11:18.912 "data_offset": 0, 00:11:18.912 "data_size": 0 00:11:18.912 }, 00:11:18.912 { 00:11:18.912 "name": "BaseBdev3", 00:11:18.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.912 "is_configured": false, 00:11:18.912 "data_offset": 0, 
00:11:18.912 "data_size": 0 00:11:18.912 } 00:11:18.912 ] 00:11:18.912 }' 00:11:18.912 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.912 13:39:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.484 13:39:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:19.745 [2024-06-10 13:39:34.092783] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:19.745 [2024-06-10 13:39:34.092808] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1554010 name Existed_Raid, state configuring 00:11:19.745 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:20.006 [2024-06-10 13:39:34.297331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:20.006 [2024-06-10 13:39:34.298535] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:20.006 [2024-06-10 13:39:34.298557] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:20.006 [2024-06-10 13:39:34.298563] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:20.006 [2024-06-10 13:39:34.298569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.006 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.267 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.267 "name": "Existed_Raid", 00:11:20.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.267 "strip_size_kb": 64, 00:11:20.267 "state": "configuring", 00:11:20.267 "raid_level": "raid0", 00:11:20.267 "superblock": false, 00:11:20.267 "num_base_bdevs": 3, 00:11:20.267 "num_base_bdevs_discovered": 1, 00:11:20.267 "num_base_bdevs_operational": 3, 00:11:20.267 "base_bdevs_list": [ 00:11:20.267 { 00:11:20.267 "name": "BaseBdev1", 00:11:20.267 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:20.267 
"is_configured": true, 00:11:20.267 "data_offset": 0, 00:11:20.267 "data_size": 65536 00:11:20.267 }, 00:11:20.267 { 00:11:20.267 "name": "BaseBdev2", 00:11:20.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.267 "is_configured": false, 00:11:20.267 "data_offset": 0, 00:11:20.267 "data_size": 0 00:11:20.267 }, 00:11:20.267 { 00:11:20.267 "name": "BaseBdev3", 00:11:20.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.267 "is_configured": false, 00:11:20.267 "data_offset": 0, 00:11:20.267 "data_size": 0 00:11:20.267 } 00:11:20.267 ] 00:11:20.267 }' 00:11:20.267 13:39:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.267 13:39:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:20.837 [2024-06-10 13:39:35.180629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:20.837 BaseBdev2 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:20.837 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:21.097 [ 00:11:21.097 { 00:11:21.097 "name": "BaseBdev2", 00:11:21.097 "aliases": [ 00:11:21.097 "99bc7d7a-efc7-472d-b7c1-5ab7e1275836" 00:11:21.097 ], 00:11:21.097 "product_name": "Malloc disk", 00:11:21.097 "block_size": 512, 00:11:21.097 "num_blocks": 65536, 00:11:21.097 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:21.097 "assigned_rate_limits": { 00:11:21.097 "rw_ios_per_sec": 0, 00:11:21.097 "rw_mbytes_per_sec": 0, 00:11:21.097 "r_mbytes_per_sec": 0, 00:11:21.097 "w_mbytes_per_sec": 0 00:11:21.097 }, 00:11:21.097 "claimed": true, 00:11:21.097 "claim_type": "exclusive_write", 00:11:21.097 "zoned": false, 00:11:21.097 "supported_io_types": { 00:11:21.097 "read": true, 00:11:21.097 "write": true, 00:11:21.097 "unmap": true, 00:11:21.097 "write_zeroes": true, 00:11:21.097 "flush": true, 00:11:21.097 "reset": true, 00:11:21.097 "compare": false, 00:11:21.097 "compare_and_write": false, 00:11:21.097 "abort": true, 00:11:21.097 "nvme_admin": false, 00:11:21.097 "nvme_io": false 00:11:21.097 }, 00:11:21.097 "memory_domains": [ 00:11:21.097 { 00:11:21.097 "dma_device_id": "system", 00:11:21.097 "dma_device_type": 1 00:11:21.097 }, 00:11:21.097 { 00:11:21.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.097 "dma_device_type": 2 00:11:21.097 } 00:11:21.097 ], 00:11:21.097 "driver_specific": {} 00:11:21.097 } 00:11:21.097 ] 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.097 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.098 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:21.357 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.357 "name": "Existed_Raid", 00:11:21.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.357 "strip_size_kb": 64, 00:11:21.357 "state": "configuring", 00:11:21.357 "raid_level": "raid0", 00:11:21.357 "superblock": false, 00:11:21.357 "num_base_bdevs": 3, 00:11:21.357 "num_base_bdevs_discovered": 2, 00:11:21.357 "num_base_bdevs_operational": 3, 00:11:21.357 "base_bdevs_list": [ 00:11:21.357 { 00:11:21.357 "name": "BaseBdev1", 00:11:21.357 "uuid": 
"46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:21.357 "is_configured": true, 00:11:21.357 "data_offset": 0, 00:11:21.357 "data_size": 65536 00:11:21.357 }, 00:11:21.357 { 00:11:21.357 "name": "BaseBdev2", 00:11:21.357 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:21.357 "is_configured": true, 00:11:21.357 "data_offset": 0, 00:11:21.357 "data_size": 65536 00:11:21.357 }, 00:11:21.357 { 00:11:21.357 "name": "BaseBdev3", 00:11:21.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.357 "is_configured": false, 00:11:21.357 "data_offset": 0, 00:11:21.357 "data_size": 0 00:11:21.357 } 00:11:21.357 ] 00:11:21.357 }' 00:11:21.357 13:39:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.357 13:39:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:21.928 [2024-06-10 13:39:36.388804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:21.928 [2024-06-10 13:39:36.388828] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1554f00 00:11:21.928 [2024-06-10 13:39:36.388832] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:21.928 [2024-06-10 13:39:36.388989] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x156bdf0 00:11:21.928 [2024-06-10 13:39:36.389086] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1554f00 00:11:21.928 [2024-06-10 13:39:36.389095] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1554f00 00:11:21.928 [2024-06-10 13:39:36.389225] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:21.928 BaseBdev3 00:11:21.928 13:39:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:21.928 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:22.189 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:22.450 [ 00:11:22.450 { 00:11:22.450 "name": "BaseBdev3", 00:11:22.450 "aliases": [ 00:11:22.450 "280891b6-0b20-467c-b370-f5ebcedff090" 00:11:22.450 ], 00:11:22.450 "product_name": "Malloc disk", 00:11:22.450 "block_size": 512, 00:11:22.450 "num_blocks": 65536, 00:11:22.450 "uuid": "280891b6-0b20-467c-b370-f5ebcedff090", 00:11:22.450 "assigned_rate_limits": { 00:11:22.450 "rw_ios_per_sec": 0, 00:11:22.450 "rw_mbytes_per_sec": 0, 00:11:22.450 "r_mbytes_per_sec": 0, 00:11:22.450 "w_mbytes_per_sec": 0 00:11:22.450 }, 00:11:22.450 "claimed": true, 00:11:22.450 "claim_type": "exclusive_write", 00:11:22.450 "zoned": false, 00:11:22.450 "supported_io_types": { 00:11:22.450 "read": true, 00:11:22.450 "write": true, 00:11:22.450 "unmap": true, 00:11:22.450 "write_zeroes": true, 00:11:22.450 "flush": true, 00:11:22.450 "reset": true, 00:11:22.450 "compare": false, 00:11:22.450 "compare_and_write": false, 
00:11:22.450 "abort": true, 00:11:22.450 "nvme_admin": false, 00:11:22.450 "nvme_io": false 00:11:22.450 }, 00:11:22.450 "memory_domains": [ 00:11:22.450 { 00:11:22.450 "dma_device_id": "system", 00:11:22.450 "dma_device_type": 1 00:11:22.450 }, 00:11:22.450 { 00:11:22.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.450 "dma_device_type": 2 00:11:22.450 } 00:11:22.450 ], 00:11:22.450 "driver_specific": {} 00:11:22.450 } 00:11:22.450 ] 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.450 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.712 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.712 "name": "Existed_Raid", 00:11:22.712 "uuid": "ea77630e-33eb-40e6-8086-1d060e44515b", 00:11:22.712 "strip_size_kb": 64, 00:11:22.713 "state": "online", 00:11:22.713 "raid_level": "raid0", 00:11:22.713 "superblock": false, 00:11:22.713 "num_base_bdevs": 3, 00:11:22.713 "num_base_bdevs_discovered": 3, 00:11:22.713 "num_base_bdevs_operational": 3, 00:11:22.713 "base_bdevs_list": [ 00:11:22.713 { 00:11:22.713 "name": "BaseBdev1", 00:11:22.713 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:22.713 "is_configured": true, 00:11:22.713 "data_offset": 0, 00:11:22.713 "data_size": 65536 00:11:22.713 }, 00:11:22.713 { 00:11:22.713 "name": "BaseBdev2", 00:11:22.713 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:22.713 "is_configured": true, 00:11:22.713 "data_offset": 0, 00:11:22.713 "data_size": 65536 00:11:22.713 }, 00:11:22.713 { 00:11:22.713 "name": "BaseBdev3", 00:11:22.713 "uuid": "280891b6-0b20-467c-b370-f5ebcedff090", 00:11:22.713 "is_configured": true, 00:11:22.713 "data_offset": 0, 00:11:22.713 "data_size": 65536 00:11:22.713 } 00:11:22.713 ] 00:11:22.713 }' 00:11:22.713 13:39:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.713 13:39:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:23.284 [2024-06-10 13:39:37.672287] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:23.284 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:23.284 "name": "Existed_Raid", 00:11:23.284 "aliases": [ 00:11:23.285 "ea77630e-33eb-40e6-8086-1d060e44515b" 00:11:23.285 ], 00:11:23.285 "product_name": "Raid Volume", 00:11:23.285 "block_size": 512, 00:11:23.285 "num_blocks": 196608, 00:11:23.285 "uuid": "ea77630e-33eb-40e6-8086-1d060e44515b", 00:11:23.285 "assigned_rate_limits": { 00:11:23.285 "rw_ios_per_sec": 0, 00:11:23.285 "rw_mbytes_per_sec": 0, 00:11:23.285 "r_mbytes_per_sec": 0, 00:11:23.285 "w_mbytes_per_sec": 0 00:11:23.285 }, 00:11:23.285 "claimed": false, 00:11:23.285 "zoned": false, 00:11:23.285 "supported_io_types": { 00:11:23.285 "read": true, 00:11:23.285 "write": true, 00:11:23.285 "unmap": true, 00:11:23.285 "write_zeroes": true, 00:11:23.285 "flush": true, 00:11:23.285 "reset": true, 00:11:23.285 "compare": false, 00:11:23.285 "compare_and_write": false, 00:11:23.285 "abort": false, 00:11:23.285 "nvme_admin": false, 00:11:23.285 "nvme_io": false 00:11:23.285 }, 00:11:23.285 "memory_domains": [ 00:11:23.285 { 00:11:23.285 "dma_device_id": "system", 00:11:23.285 "dma_device_type": 1 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:23.285 "dma_device_type": 2 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "dma_device_id": "system", 00:11:23.285 "dma_device_type": 1 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.285 "dma_device_type": 2 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "dma_device_id": "system", 00:11:23.285 "dma_device_type": 1 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.285 "dma_device_type": 2 00:11:23.285 } 00:11:23.285 ], 00:11:23.285 "driver_specific": { 00:11:23.285 "raid": { 00:11:23.285 "uuid": "ea77630e-33eb-40e6-8086-1d060e44515b", 00:11:23.285 "strip_size_kb": 64, 00:11:23.285 "state": "online", 00:11:23.285 "raid_level": "raid0", 00:11:23.285 "superblock": false, 00:11:23.285 "num_base_bdevs": 3, 00:11:23.285 "num_base_bdevs_discovered": 3, 00:11:23.285 "num_base_bdevs_operational": 3, 00:11:23.285 "base_bdevs_list": [ 00:11:23.285 { 00:11:23.285 "name": "BaseBdev1", 00:11:23.285 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:23.285 "is_configured": true, 00:11:23.285 "data_offset": 0, 00:11:23.285 "data_size": 65536 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "name": "BaseBdev2", 00:11:23.285 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:23.285 "is_configured": true, 00:11:23.285 "data_offset": 0, 00:11:23.285 "data_size": 65536 00:11:23.285 }, 00:11:23.285 { 00:11:23.285 "name": "BaseBdev3", 00:11:23.285 "uuid": "280891b6-0b20-467c-b370-f5ebcedff090", 00:11:23.285 "is_configured": true, 00:11:23.285 "data_offset": 0, 00:11:23.285 "data_size": 65536 00:11:23.285 } 00:11:23.285 ] 00:11:23.285 } 00:11:23.285 } 00:11:23.285 }' 00:11:23.285 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:23.285 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:23.285 BaseBdev2 
00:11:23.285 BaseBdev3' 00:11:23.285 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:23.285 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:23.285 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:23.546 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:23.546 "name": "BaseBdev1", 00:11:23.546 "aliases": [ 00:11:23.546 "46a1cb72-e792-4228-9656-0cf8291cfb0a" 00:11:23.546 ], 00:11:23.546 "product_name": "Malloc disk", 00:11:23.546 "block_size": 512, 00:11:23.546 "num_blocks": 65536, 00:11:23.546 "uuid": "46a1cb72-e792-4228-9656-0cf8291cfb0a", 00:11:23.546 "assigned_rate_limits": { 00:11:23.546 "rw_ios_per_sec": 0, 00:11:23.546 "rw_mbytes_per_sec": 0, 00:11:23.546 "r_mbytes_per_sec": 0, 00:11:23.546 "w_mbytes_per_sec": 0 00:11:23.546 }, 00:11:23.546 "claimed": true, 00:11:23.546 "claim_type": "exclusive_write", 00:11:23.546 "zoned": false, 00:11:23.546 "supported_io_types": { 00:11:23.546 "read": true, 00:11:23.546 "write": true, 00:11:23.546 "unmap": true, 00:11:23.546 "write_zeroes": true, 00:11:23.546 "flush": true, 00:11:23.546 "reset": true, 00:11:23.546 "compare": false, 00:11:23.546 "compare_and_write": false, 00:11:23.546 "abort": true, 00:11:23.546 "nvme_admin": false, 00:11:23.546 "nvme_io": false 00:11:23.546 }, 00:11:23.546 "memory_domains": [ 00:11:23.546 { 00:11:23.546 "dma_device_id": "system", 00:11:23.546 "dma_device_type": 1 00:11:23.546 }, 00:11:23.546 { 00:11:23.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.546 "dma_device_type": 2 00:11:23.546 } 00:11:23.546 ], 00:11:23.546 "driver_specific": {} 00:11:23.546 }' 00:11:23.546 13:39:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.546 13:39:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:23.546 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:23.546 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:23.807 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:24.069 "name": "BaseBdev2", 00:11:24.069 "aliases": [ 00:11:24.069 "99bc7d7a-efc7-472d-b7c1-5ab7e1275836" 00:11:24.069 ], 00:11:24.069 "product_name": "Malloc disk", 00:11:24.069 "block_size": 512, 00:11:24.069 "num_blocks": 65536, 00:11:24.069 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:24.069 "assigned_rate_limits": { 00:11:24.069 
"rw_ios_per_sec": 0, 00:11:24.069 "rw_mbytes_per_sec": 0, 00:11:24.069 "r_mbytes_per_sec": 0, 00:11:24.069 "w_mbytes_per_sec": 0 00:11:24.069 }, 00:11:24.069 "claimed": true, 00:11:24.069 "claim_type": "exclusive_write", 00:11:24.069 "zoned": false, 00:11:24.069 "supported_io_types": { 00:11:24.069 "read": true, 00:11:24.069 "write": true, 00:11:24.069 "unmap": true, 00:11:24.069 "write_zeroes": true, 00:11:24.069 "flush": true, 00:11:24.069 "reset": true, 00:11:24.069 "compare": false, 00:11:24.069 "compare_and_write": false, 00:11:24.069 "abort": true, 00:11:24.069 "nvme_admin": false, 00:11:24.069 "nvme_io": false 00:11:24.069 }, 00:11:24.069 "memory_domains": [ 00:11:24.069 { 00:11:24.069 "dma_device_id": "system", 00:11:24.069 "dma_device_type": 1 00:11:24.069 }, 00:11:24.069 { 00:11:24.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.069 "dma_device_type": 2 00:11:24.069 } 00:11:24.069 ], 00:11:24.069 "driver_specific": {} 00:11:24.069 }' 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.069 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.364 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:24.364 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.364 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.364 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:24.365 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.365 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.365 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:24.365 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:11:24.365 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:24.656 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:24.656 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:24.656 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:24.656 13:39:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:24.656 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:24.656 "name": "BaseBdev3", 00:11:24.656 "aliases": [ 00:11:24.656 "280891b6-0b20-467c-b370-f5ebcedff090" 00:11:24.656 ], 00:11:24.656 "product_name": "Malloc disk", 00:11:24.656 "block_size": 512, 00:11:24.656 "num_blocks": 65536, 00:11:24.656 "uuid": "280891b6-0b20-467c-b370-f5ebcedff090", 00:11:24.656 "assigned_rate_limits": { 00:11:24.656 "rw_ios_per_sec": 0, 00:11:24.656 "rw_mbytes_per_sec": 0, 00:11:24.656 "r_mbytes_per_sec": 0, 00:11:24.656 "w_mbytes_per_sec": 0 00:11:24.656 }, 00:11:24.656 "claimed": true, 00:11:24.656 "claim_type": "exclusive_write", 00:11:24.656 "zoned": false, 00:11:24.656 "supported_io_types": { 00:11:24.656 "read": true, 00:11:24.656 "write": true, 00:11:24.656 "unmap": true, 00:11:24.656 "write_zeroes": true, 00:11:24.656 "flush": true, 00:11:24.656 "reset": true, 00:11:24.656 "compare": false, 00:11:24.656 "compare_and_write": false, 00:11:24.656 "abort": true, 00:11:24.656 "nvme_admin": false, 00:11:24.656 "nvme_io": false 00:11:24.656 }, 00:11:24.656 "memory_domains": [ 00:11:24.656 { 00:11:24.656 "dma_device_id": "system", 00:11:24.656 "dma_device_type": 1 00:11:24.656 }, 00:11:24.656 { 00:11:24.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.656 "dma_device_type": 2 00:11:24.656 } 00:11:24.656 ], 
00:11:24.656 "driver_specific": {} 00:11:24.656 }' 00:11:24.656 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.656 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:24.917 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:25.178 [2024-06-10 13:39:39.601016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:25.178 [2024-06-10 13:39:39.601032] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:25.178 [2024-06-10 13:39:39.601070] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:25.178 13:39:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.178 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.438 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.438 "name": 
"Existed_Raid", 00:11:25.438 "uuid": "ea77630e-33eb-40e6-8086-1d060e44515b", 00:11:25.438 "strip_size_kb": 64, 00:11:25.438 "state": "offline", 00:11:25.438 "raid_level": "raid0", 00:11:25.438 "superblock": false, 00:11:25.438 "num_base_bdevs": 3, 00:11:25.438 "num_base_bdevs_discovered": 2, 00:11:25.438 "num_base_bdevs_operational": 2, 00:11:25.438 "base_bdevs_list": [ 00:11:25.438 { 00:11:25.438 "name": null, 00:11:25.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.438 "is_configured": false, 00:11:25.438 "data_offset": 0, 00:11:25.438 "data_size": 65536 00:11:25.438 }, 00:11:25.438 { 00:11:25.438 "name": "BaseBdev2", 00:11:25.438 "uuid": "99bc7d7a-efc7-472d-b7c1-5ab7e1275836", 00:11:25.438 "is_configured": true, 00:11:25.438 "data_offset": 0, 00:11:25.438 "data_size": 65536 00:11:25.438 }, 00:11:25.438 { 00:11:25.438 "name": "BaseBdev3", 00:11:25.438 "uuid": "280891b6-0b20-467c-b370-f5ebcedff090", 00:11:25.438 "is_configured": true, 00:11:25.438 "data_offset": 0, 00:11:25.438 "data_size": 65536 00:11:25.438 } 00:11:25.438 ] 00:11:25.438 }' 00:11:25.438 13:39:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.438 13:39:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.009 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:26.009 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:26.009 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.009 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:26.270 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:26.270 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:11:26.270 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:26.532 [2024-06-10 13:39:40.756009] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:26.532 13:39:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:26.793 [2024-06-10 13:39:41.163027] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:26.793 [2024-06-10 13:39:41.163057] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1554f00 name Existed_Raid, state offline 00:11:26.793 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:26.793 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:26.793 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.793 13:39:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:27.054 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:27.315 BaseBdev2 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:27.315 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:27.575 [ 00:11:27.575 { 00:11:27.575 "name": 
"BaseBdev2", 00:11:27.575 "aliases": [ 00:11:27.575 "8d095887-6e2e-4e8e-a275-87182915a61b" 00:11:27.575 ], 00:11:27.575 "product_name": "Malloc disk", 00:11:27.575 "block_size": 512, 00:11:27.575 "num_blocks": 65536, 00:11:27.575 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:27.575 "assigned_rate_limits": { 00:11:27.575 "rw_ios_per_sec": 0, 00:11:27.575 "rw_mbytes_per_sec": 0, 00:11:27.575 "r_mbytes_per_sec": 0, 00:11:27.575 "w_mbytes_per_sec": 0 00:11:27.575 }, 00:11:27.575 "claimed": false, 00:11:27.575 "zoned": false, 00:11:27.575 "supported_io_types": { 00:11:27.575 "read": true, 00:11:27.575 "write": true, 00:11:27.575 "unmap": true, 00:11:27.575 "write_zeroes": true, 00:11:27.575 "flush": true, 00:11:27.575 "reset": true, 00:11:27.575 "compare": false, 00:11:27.575 "compare_and_write": false, 00:11:27.575 "abort": true, 00:11:27.575 "nvme_admin": false, 00:11:27.575 "nvme_io": false 00:11:27.575 }, 00:11:27.575 "memory_domains": [ 00:11:27.575 { 00:11:27.575 "dma_device_id": "system", 00:11:27.575 "dma_device_type": 1 00:11:27.575 }, 00:11:27.575 { 00:11:27.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.575 "dma_device_type": 2 00:11:27.575 } 00:11:27.575 ], 00:11:27.575 "driver_specific": {} 00:11:27.575 } 00:11:27.575 ] 00:11:27.575 13:39:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:27.575 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:27.575 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:27.575 13:39:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:27.835 BaseBdev3 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:27.835 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:28.095 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:28.095 [ 00:11:28.095 { 00:11:28.095 "name": "BaseBdev3", 00:11:28.095 "aliases": [ 00:11:28.095 "e58e5506-8ef9-4010-8eee-296401d5d183" 00:11:28.095 ], 00:11:28.095 "product_name": "Malloc disk", 00:11:28.095 "block_size": 512, 00:11:28.095 "num_blocks": 65536, 00:11:28.095 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:28.095 "assigned_rate_limits": { 00:11:28.095 "rw_ios_per_sec": 0, 00:11:28.095 "rw_mbytes_per_sec": 0, 00:11:28.095 "r_mbytes_per_sec": 0, 00:11:28.095 "w_mbytes_per_sec": 0 00:11:28.095 }, 00:11:28.095 "claimed": false, 00:11:28.095 "zoned": false, 00:11:28.095 "supported_io_types": { 00:11:28.095 "read": true, 00:11:28.095 "write": true, 00:11:28.095 "unmap": true, 00:11:28.095 "write_zeroes": true, 00:11:28.095 "flush": true, 00:11:28.095 "reset": true, 00:11:28.095 "compare": false, 00:11:28.095 "compare_and_write": false, 00:11:28.095 "abort": true, 00:11:28.095 "nvme_admin": false, 00:11:28.095 "nvme_io": false 00:11:28.095 }, 00:11:28.095 "memory_domains": [ 00:11:28.095 { 00:11:28.095 "dma_device_id": "system", 
00:11:28.095 "dma_device_type": 1 00:11:28.095 }, 00:11:28.095 { 00:11:28.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.095 "dma_device_type": 2 00:11:28.095 } 00:11:28.095 ], 00:11:28.095 "driver_specific": {} 00:11:28.095 } 00:11:28.095 ] 00:11:28.095 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:28.095 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:28.095 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:28.095 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:28.355 [2024-06-10 13:39:42.727183] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:28.355 [2024-06-10 13:39:42.727212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:28.355 [2024-06-10 13:39:42.727225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:28.355 [2024-06-10 13:39:42.728313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.355 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.616 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.616 "name": "Existed_Raid", 00:11:28.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.616 "strip_size_kb": 64, 00:11:28.616 "state": "configuring", 00:11:28.616 "raid_level": "raid0", 00:11:28.616 "superblock": false, 00:11:28.616 "num_base_bdevs": 3, 00:11:28.616 "num_base_bdevs_discovered": 2, 00:11:28.616 "num_base_bdevs_operational": 3, 00:11:28.616 "base_bdevs_list": [ 00:11:28.616 { 00:11:28.616 "name": "BaseBdev1", 00:11:28.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.616 "is_configured": false, 00:11:28.616 "data_offset": 0, 00:11:28.616 "data_size": 0 00:11:28.616 }, 00:11:28.616 { 00:11:28.616 "name": "BaseBdev2", 00:11:28.616 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:28.616 "is_configured": true, 00:11:28.616 "data_offset": 0, 00:11:28.616 "data_size": 65536 00:11:28.616 }, 00:11:28.616 { 00:11:28.616 "name": "BaseBdev3", 00:11:28.616 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:28.616 "is_configured": true, 00:11:28.616 "data_offset": 0, 00:11:28.616 "data_size": 65536 00:11:28.616 
} 00:11:28.616 ] 00:11:28.616 }' 00:11:28.616 13:39:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.616 13:39:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.188 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:29.449 [2024-06-10 13:39:43.677574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.449 "name": "Existed_Raid", 00:11:29.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.449 "strip_size_kb": 64, 00:11:29.449 "state": "configuring", 00:11:29.449 "raid_level": "raid0", 00:11:29.449 "superblock": false, 00:11:29.449 "num_base_bdevs": 3, 00:11:29.449 "num_base_bdevs_discovered": 1, 00:11:29.449 "num_base_bdevs_operational": 3, 00:11:29.449 "base_bdevs_list": [ 00:11:29.449 { 00:11:29.449 "name": "BaseBdev1", 00:11:29.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.449 "is_configured": false, 00:11:29.449 "data_offset": 0, 00:11:29.449 "data_size": 0 00:11:29.449 }, 00:11:29.449 { 00:11:29.449 "name": null, 00:11:29.449 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:29.449 "is_configured": false, 00:11:29.449 "data_offset": 0, 00:11:29.449 "data_size": 65536 00:11:29.449 }, 00:11:29.449 { 00:11:29.449 "name": "BaseBdev3", 00:11:29.449 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:29.449 "is_configured": true, 00:11:29.449 "data_offset": 0, 00:11:29.449 "data_size": 65536 00:11:29.449 } 00:11:29.449 ] 00:11:29.449 }' 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.449 13:39:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.019 13:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.019 13:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:30.279 13:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:30.279 13:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:30.539 [2024-06-10 13:39:44.853660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:30.539 BaseBdev1 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:30.539 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:30.540 13:39:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:30.800 [ 00:11:30.800 { 00:11:30.800 "name": "BaseBdev1", 00:11:30.800 "aliases": [ 00:11:30.800 "b5b45320-ed84-4cc9-9869-1a9ba6f32aee" 00:11:30.800 ], 00:11:30.800 "product_name": "Malloc disk", 00:11:30.800 "block_size": 512, 00:11:30.800 "num_blocks": 65536, 00:11:30.800 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:30.800 "assigned_rate_limits": { 00:11:30.800 "rw_ios_per_sec": 0, 00:11:30.800 "rw_mbytes_per_sec": 0, 00:11:30.800 "r_mbytes_per_sec": 0, 00:11:30.800 "w_mbytes_per_sec": 0 00:11:30.800 }, 00:11:30.800 "claimed": true, 00:11:30.800 "claim_type": "exclusive_write", 00:11:30.800 "zoned": 
false, 00:11:30.800 "supported_io_types": { 00:11:30.800 "read": true, 00:11:30.800 "write": true, 00:11:30.800 "unmap": true, 00:11:30.800 "write_zeroes": true, 00:11:30.800 "flush": true, 00:11:30.800 "reset": true, 00:11:30.800 "compare": false, 00:11:30.800 "compare_and_write": false, 00:11:30.800 "abort": true, 00:11:30.800 "nvme_admin": false, 00:11:30.800 "nvme_io": false 00:11:30.800 }, 00:11:30.800 "memory_domains": [ 00:11:30.800 { 00:11:30.800 "dma_device_id": "system", 00:11:30.800 "dma_device_type": 1 00:11:30.800 }, 00:11:30.800 { 00:11:30.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.800 "dma_device_type": 2 00:11:30.800 } 00:11:30.800 ], 00:11:30.800 "driver_specific": {} 00:11:30.800 } 00:11:30.800 ] 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.800 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:31.061 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.061 "name": "Existed_Raid", 00:11:31.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.061 "strip_size_kb": 64, 00:11:31.061 "state": "configuring", 00:11:31.061 "raid_level": "raid0", 00:11:31.061 "superblock": false, 00:11:31.061 "num_base_bdevs": 3, 00:11:31.061 "num_base_bdevs_discovered": 2, 00:11:31.061 "num_base_bdevs_operational": 3, 00:11:31.061 "base_bdevs_list": [ 00:11:31.061 { 00:11:31.061 "name": "BaseBdev1", 00:11:31.061 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:31.061 "is_configured": true, 00:11:31.061 "data_offset": 0, 00:11:31.061 "data_size": 65536 00:11:31.061 }, 00:11:31.061 { 00:11:31.061 "name": null, 00:11:31.061 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:31.061 "is_configured": false, 00:11:31.061 "data_offset": 0, 00:11:31.061 "data_size": 65536 00:11:31.061 }, 00:11:31.061 { 00:11:31.061 "name": "BaseBdev3", 00:11:31.061 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:31.061 "is_configured": true, 00:11:31.061 "data_offset": 0, 00:11:31.061 "data_size": 65536 00:11:31.061 } 00:11:31.061 ] 00:11:31.061 }' 00:11:31.061 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.061 13:39:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.633 13:39:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.633 13:39:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:31.893 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:31.893 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:32.154 [2024-06-10 13:39:46.381576] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.154 "name": "Existed_Raid", 00:11:32.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.154 "strip_size_kb": 64, 00:11:32.154 "state": "configuring", 00:11:32.154 "raid_level": "raid0", 00:11:32.154 "superblock": false, 00:11:32.154 "num_base_bdevs": 3, 00:11:32.154 "num_base_bdevs_discovered": 1, 00:11:32.154 "num_base_bdevs_operational": 3, 00:11:32.154 "base_bdevs_list": [ 00:11:32.154 { 00:11:32.154 "name": "BaseBdev1", 00:11:32.154 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:32.154 "is_configured": true, 00:11:32.154 "data_offset": 0, 00:11:32.154 "data_size": 65536 00:11:32.154 }, 00:11:32.154 { 00:11:32.154 "name": null, 00:11:32.154 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:32.154 "is_configured": false, 00:11:32.154 "data_offset": 0, 00:11:32.154 "data_size": 65536 00:11:32.154 }, 00:11:32.154 { 00:11:32.154 "name": null, 00:11:32.154 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:32.154 "is_configured": false, 00:11:32.154 "data_offset": 0, 00:11:32.154 "data_size": 65536 00:11:32.154 } 00:11:32.154 ] 00:11:32.154 }' 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.154 13:39:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.725 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.725 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:32.986 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:32.986 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:33.246 [2024-06-10 13:39:47.508456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.246 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.246 "name": "Existed_Raid", 00:11:33.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.246 "strip_size_kb": 64, 00:11:33.246 "state": "configuring", 00:11:33.246 "raid_level": "raid0", 
00:11:33.246 "superblock": false, 00:11:33.246 "num_base_bdevs": 3, 00:11:33.246 "num_base_bdevs_discovered": 2, 00:11:33.246 "num_base_bdevs_operational": 3, 00:11:33.246 "base_bdevs_list": [ 00:11:33.246 { 00:11:33.246 "name": "BaseBdev1", 00:11:33.246 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:33.246 "is_configured": true, 00:11:33.246 "data_offset": 0, 00:11:33.246 "data_size": 65536 00:11:33.246 }, 00:11:33.246 { 00:11:33.246 "name": null, 00:11:33.246 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:33.246 "is_configured": false, 00:11:33.246 "data_offset": 0, 00:11:33.246 "data_size": 65536 00:11:33.246 }, 00:11:33.246 { 00:11:33.246 "name": "BaseBdev3", 00:11:33.247 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:33.247 "is_configured": true, 00:11:33.247 "data_offset": 0, 00:11:33.247 "data_size": 65536 00:11:33.247 } 00:11:33.247 ] 00:11:33.247 }' 00:11:33.247 13:39:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.247 13:39:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.817 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.817 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:34.078 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:34.078 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:34.338 [2024-06-10 13:39:48.595235] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.338 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.597 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.597 "name": "Existed_Raid", 00:11:34.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.597 "strip_size_kb": 64, 00:11:34.597 "state": "configuring", 00:11:34.598 "raid_level": "raid0", 00:11:34.598 "superblock": false, 00:11:34.598 "num_base_bdevs": 3, 00:11:34.598 "num_base_bdevs_discovered": 1, 00:11:34.598 "num_base_bdevs_operational": 3, 00:11:34.598 "base_bdevs_list": [ 00:11:34.598 { 00:11:34.598 "name": null, 00:11:34.598 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:34.598 "is_configured": false, 
00:11:34.598 "data_offset": 0, 00:11:34.598 "data_size": 65536 00:11:34.598 }, 00:11:34.598 { 00:11:34.598 "name": null, 00:11:34.598 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:34.598 "is_configured": false, 00:11:34.598 "data_offset": 0, 00:11:34.598 "data_size": 65536 00:11:34.598 }, 00:11:34.598 { 00:11:34.598 "name": "BaseBdev3", 00:11:34.598 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:34.598 "is_configured": true, 00:11:34.598 "data_offset": 0, 00:11:34.598 "data_size": 65536 00:11:34.598 } 00:11:34.598 ] 00:11:34.598 }' 00:11:34.598 13:39:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.598 13:39:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.169 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.169 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:35.169 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:35.169 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:35.429 [2024-06-10 13:39:49.776241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.429 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:35.689 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:35.689 "name": "Existed_Raid", 00:11:35.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:35.689 "strip_size_kb": 64, 00:11:35.689 "state": "configuring", 00:11:35.689 "raid_level": "raid0", 00:11:35.689 "superblock": false, 00:11:35.689 "num_base_bdevs": 3, 00:11:35.689 "num_base_bdevs_discovered": 2, 00:11:35.689 "num_base_bdevs_operational": 3, 00:11:35.689 "base_bdevs_list": [ 00:11:35.689 { 00:11:35.689 "name": null, 00:11:35.689 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:35.689 "is_configured": false, 00:11:35.689 "data_offset": 0, 00:11:35.690 "data_size": 65536 00:11:35.690 }, 00:11:35.690 { 00:11:35.690 "name": "BaseBdev2", 00:11:35.690 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:35.690 "is_configured": true, 00:11:35.690 "data_offset": 0, 00:11:35.690 "data_size": 65536 00:11:35.690 }, 
00:11:35.690 { 00:11:35.690 "name": "BaseBdev3", 00:11:35.690 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:35.690 "is_configured": true, 00:11:35.690 "data_offset": 0, 00:11:35.690 "data_size": 65536 00:11:35.690 } 00:11:35.690 ] 00:11:35.690 }' 00:11:35.690 13:39:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:35.690 13:39:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.259 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.259 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:36.520 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:36.520 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.520 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:36.520 13:39:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b5b45320-ed84-4cc9-9869-1a9ba6f32aee 00:11:36.780 [2024-06-10 13:39:51.140802] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:36.780 [2024-06-10 13:39:51.140826] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1556090 00:11:36.780 [2024-06-10 13:39:51.140830] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:36.780 [2024-06-10 13:39:51.140986] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1554710 00:11:36.780 
[2024-06-10 13:39:51.141078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1556090 00:11:36.780 [2024-06-10 13:39:51.141084] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1556090 00:11:36.780 [2024-06-10 13:39:51.141216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:36.780 NewBaseBdev 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:36.780 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:37.040 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:37.300 [ 00:11:37.300 { 00:11:37.300 "name": "NewBaseBdev", 00:11:37.300 "aliases": [ 00:11:37.300 "b5b45320-ed84-4cc9-9869-1a9ba6f32aee" 00:11:37.300 ], 00:11:37.300 "product_name": "Malloc disk", 00:11:37.300 "block_size": 512, 00:11:37.300 "num_blocks": 65536, 00:11:37.300 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:37.300 "assigned_rate_limits": { 00:11:37.300 "rw_ios_per_sec": 0, 00:11:37.300 "rw_mbytes_per_sec": 0, 00:11:37.300 "r_mbytes_per_sec": 0, 00:11:37.300 
"w_mbytes_per_sec": 0 00:11:37.300 }, 00:11:37.300 "claimed": true, 00:11:37.300 "claim_type": "exclusive_write", 00:11:37.300 "zoned": false, 00:11:37.300 "supported_io_types": { 00:11:37.300 "read": true, 00:11:37.300 "write": true, 00:11:37.300 "unmap": true, 00:11:37.300 "write_zeroes": true, 00:11:37.300 "flush": true, 00:11:37.300 "reset": true, 00:11:37.300 "compare": false, 00:11:37.300 "compare_and_write": false, 00:11:37.300 "abort": true, 00:11:37.300 "nvme_admin": false, 00:11:37.300 "nvme_io": false 00:11:37.300 }, 00:11:37.300 "memory_domains": [ 00:11:37.300 { 00:11:37.300 "dma_device_id": "system", 00:11:37.300 "dma_device_type": 1 00:11:37.300 }, 00:11:37.300 { 00:11:37.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.300 "dma_device_type": 2 00:11:37.300 } 00:11:37.300 ], 00:11:37.301 "driver_specific": {} 00:11:37.301 } 00:11:37.301 ] 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.301 "name": "Existed_Raid", 00:11:37.301 "uuid": "875cee98-709d-453f-a58f-da15deb75e0e", 00:11:37.301 "strip_size_kb": 64, 00:11:37.301 "state": "online", 00:11:37.301 "raid_level": "raid0", 00:11:37.301 "superblock": false, 00:11:37.301 "num_base_bdevs": 3, 00:11:37.301 "num_base_bdevs_discovered": 3, 00:11:37.301 "num_base_bdevs_operational": 3, 00:11:37.301 "base_bdevs_list": [ 00:11:37.301 { 00:11:37.301 "name": "NewBaseBdev", 00:11:37.301 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:37.301 "is_configured": true, 00:11:37.301 "data_offset": 0, 00:11:37.301 "data_size": 65536 00:11:37.301 }, 00:11:37.301 { 00:11:37.301 "name": "BaseBdev2", 00:11:37.301 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:37.301 "is_configured": true, 00:11:37.301 "data_offset": 0, 00:11:37.301 "data_size": 65536 00:11:37.301 }, 00:11:37.301 { 00:11:37.301 "name": "BaseBdev3", 00:11:37.301 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:37.301 "is_configured": true, 00:11:37.301 "data_offset": 0, 00:11:37.301 "data_size": 65536 00:11:37.301 } 00:11:37.301 ] 00:11:37.301 }' 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.301 13:39:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.870 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 
00:11:37.870 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:37.871 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:38.131 [2024-06-10 13:39:52.524534] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:38.131 "name": "Existed_Raid", 00:11:38.131 "aliases": [ 00:11:38.131 "875cee98-709d-453f-a58f-da15deb75e0e" 00:11:38.131 ], 00:11:38.131 "product_name": "Raid Volume", 00:11:38.131 "block_size": 512, 00:11:38.131 "num_blocks": 196608, 00:11:38.131 "uuid": "875cee98-709d-453f-a58f-da15deb75e0e", 00:11:38.131 "assigned_rate_limits": { 00:11:38.131 "rw_ios_per_sec": 0, 00:11:38.131 "rw_mbytes_per_sec": 0, 00:11:38.131 "r_mbytes_per_sec": 0, 00:11:38.131 "w_mbytes_per_sec": 0 00:11:38.131 }, 00:11:38.131 "claimed": false, 00:11:38.131 "zoned": false, 00:11:38.131 "supported_io_types": { 00:11:38.131 "read": true, 00:11:38.131 "write": true, 00:11:38.131 "unmap": true, 00:11:38.131 "write_zeroes": true, 00:11:38.131 "flush": true, 00:11:38.131 "reset": true, 00:11:38.131 "compare": false, 00:11:38.131 "compare_and_write": false, 00:11:38.131 "abort": false, 00:11:38.131 "nvme_admin": false, 00:11:38.131 
"nvme_io": false 00:11:38.131 }, 00:11:38.131 "memory_domains": [ 00:11:38.131 { 00:11:38.131 "dma_device_id": "system", 00:11:38.131 "dma_device_type": 1 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.131 "dma_device_type": 2 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "dma_device_id": "system", 00:11:38.131 "dma_device_type": 1 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.131 "dma_device_type": 2 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "dma_device_id": "system", 00:11:38.131 "dma_device_type": 1 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.131 "dma_device_type": 2 00:11:38.131 } 00:11:38.131 ], 00:11:38.131 "driver_specific": { 00:11:38.131 "raid": { 00:11:38.131 "uuid": "875cee98-709d-453f-a58f-da15deb75e0e", 00:11:38.131 "strip_size_kb": 64, 00:11:38.131 "state": "online", 00:11:38.131 "raid_level": "raid0", 00:11:38.131 "superblock": false, 00:11:38.131 "num_base_bdevs": 3, 00:11:38.131 "num_base_bdevs_discovered": 3, 00:11:38.131 "num_base_bdevs_operational": 3, 00:11:38.131 "base_bdevs_list": [ 00:11:38.131 { 00:11:38.131 "name": "NewBaseBdev", 00:11:38.131 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:38.131 "is_configured": true, 00:11:38.131 "data_offset": 0, 00:11:38.131 "data_size": 65536 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "name": "BaseBdev2", 00:11:38.131 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:38.131 "is_configured": true, 00:11:38.131 "data_offset": 0, 00:11:38.131 "data_size": 65536 00:11:38.131 }, 00:11:38.131 { 00:11:38.131 "name": "BaseBdev3", 00:11:38.131 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:38.131 "is_configured": true, 00:11:38.131 "data_offset": 0, 00:11:38.131 "data_size": 65536 00:11:38.131 } 00:11:38.131 ] 00:11:38.131 } 00:11:38.131 } 00:11:38.131 }' 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:38.131 BaseBdev2 00:11:38.131 BaseBdev3' 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:38.131 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.392 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.392 "name": "NewBaseBdev", 00:11:38.392 "aliases": [ 00:11:38.392 "b5b45320-ed84-4cc9-9869-1a9ba6f32aee" 00:11:38.392 ], 00:11:38.392 "product_name": "Malloc disk", 00:11:38.392 "block_size": 512, 00:11:38.392 "num_blocks": 65536, 00:11:38.392 "uuid": "b5b45320-ed84-4cc9-9869-1a9ba6f32aee", 00:11:38.392 "assigned_rate_limits": { 00:11:38.392 "rw_ios_per_sec": 0, 00:11:38.392 "rw_mbytes_per_sec": 0, 00:11:38.392 "r_mbytes_per_sec": 0, 00:11:38.392 "w_mbytes_per_sec": 0 00:11:38.392 }, 00:11:38.392 "claimed": true, 00:11:38.392 "claim_type": "exclusive_write", 00:11:38.392 "zoned": false, 00:11:38.392 "supported_io_types": { 00:11:38.392 "read": true, 00:11:38.392 "write": true, 00:11:38.392 "unmap": true, 00:11:38.392 "write_zeroes": true, 00:11:38.392 "flush": true, 00:11:38.392 "reset": true, 00:11:38.392 "compare": false, 00:11:38.392 "compare_and_write": false, 00:11:38.392 "abort": true, 00:11:38.392 "nvme_admin": false, 00:11:38.392 "nvme_io": false 00:11:38.392 }, 00:11:38.392 "memory_domains": [ 00:11:38.392 { 00:11:38.392 "dma_device_id": "system", 00:11:38.392 "dma_device_type": 1 00:11:38.392 }, 00:11:38.392 { 00:11:38.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.392 
"dma_device_type": 2 00:11:38.392 } 00:11:38.392 ], 00:11:38.392 "driver_specific": {} 00:11:38.392 }' 00:11:38.392 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.392 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.653 13:39:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.653 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:38.653 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.653 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.913 "name": "BaseBdev2", 00:11:38.913 "aliases": [ 00:11:38.913 "8d095887-6e2e-4e8e-a275-87182915a61b" 00:11:38.913 ], 00:11:38.913 
"product_name": "Malloc disk", 00:11:38.913 "block_size": 512, 00:11:38.913 "num_blocks": 65536, 00:11:38.913 "uuid": "8d095887-6e2e-4e8e-a275-87182915a61b", 00:11:38.913 "assigned_rate_limits": { 00:11:38.913 "rw_ios_per_sec": 0, 00:11:38.913 "rw_mbytes_per_sec": 0, 00:11:38.913 "r_mbytes_per_sec": 0, 00:11:38.913 "w_mbytes_per_sec": 0 00:11:38.913 }, 00:11:38.913 "claimed": true, 00:11:38.913 "claim_type": "exclusive_write", 00:11:38.913 "zoned": false, 00:11:38.913 "supported_io_types": { 00:11:38.913 "read": true, 00:11:38.913 "write": true, 00:11:38.913 "unmap": true, 00:11:38.913 "write_zeroes": true, 00:11:38.913 "flush": true, 00:11:38.913 "reset": true, 00:11:38.913 "compare": false, 00:11:38.913 "compare_and_write": false, 00:11:38.913 "abort": true, 00:11:38.913 "nvme_admin": false, 00:11:38.913 "nvme_io": false 00:11:38.913 }, 00:11:38.913 "memory_domains": [ 00:11:38.913 { 00:11:38.913 "dma_device_id": "system", 00:11:38.913 "dma_device_type": 1 00:11:38.913 }, 00:11:38.913 { 00:11:38.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.913 "dma_device_type": 2 00:11:38.913 } 00:11:38.913 ], 00:11:38.913 "driver_specific": {} 00:11:38.913 }' 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.913 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.174 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.433 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.433 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:39.433 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:39.433 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.433 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.433 "name": "BaseBdev3", 00:11:39.433 "aliases": [ 00:11:39.433 "e58e5506-8ef9-4010-8eee-296401d5d183" 00:11:39.433 ], 00:11:39.433 "product_name": "Malloc disk", 00:11:39.433 "block_size": 512, 00:11:39.433 "num_blocks": 65536, 00:11:39.433 "uuid": "e58e5506-8ef9-4010-8eee-296401d5d183", 00:11:39.433 "assigned_rate_limits": { 00:11:39.433 "rw_ios_per_sec": 0, 00:11:39.433 "rw_mbytes_per_sec": 0, 00:11:39.433 "r_mbytes_per_sec": 0, 00:11:39.433 "w_mbytes_per_sec": 0 00:11:39.433 }, 00:11:39.433 "claimed": true, 00:11:39.433 "claim_type": "exclusive_write", 00:11:39.433 "zoned": false, 00:11:39.433 "supported_io_types": { 00:11:39.433 "read": true, 00:11:39.433 "write": true, 00:11:39.433 "unmap": true, 00:11:39.433 "write_zeroes": true, 00:11:39.433 "flush": true, 00:11:39.433 "reset": true, 00:11:39.433 "compare": false, 00:11:39.433 "compare_and_write": false, 00:11:39.433 "abort": true, 00:11:39.433 "nvme_admin": false, 00:11:39.433 "nvme_io": false 00:11:39.433 }, 00:11:39.433 "memory_domains": [ 00:11:39.433 { 00:11:39.433 
"dma_device_id": "system", 00:11:39.434 "dma_device_type": 1 00:11:39.434 }, 00:11:39.434 { 00:11:39.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.434 "dma_device_type": 2 00:11:39.434 } 00:11:39.434 ], 00:11:39.434 "driver_specific": {} 00:11:39.434 }' 00:11:39.434 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.693 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.693 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.693 13:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.693 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.954 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.954 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.954 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:39.954 [2024-06-10 13:39:54.421191] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:39.954 [2024-06-10 13:39:54.421206] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:39.954 [2024-06-10 
13:39:54.421246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.954 [2024-06-10 13:39:54.421286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.954 [2024-06-10 13:39:54.421292] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1556090 name Existed_Raid, state offline 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1512911 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1512911 ']' 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1512911 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1512911 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1512911' 00:11:40.214 killing process with pid 1512911 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1512911 00:11:40.214 [2024-06-10 13:39:54.490613] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1512911 00:11:40.214 [2024-06-10 13:39:54.505959] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- 
# return 0 00:11:40.214 00:11:40.214 real 0m24.323s 00:11:40.214 user 0m45.588s 00:11:40.214 sys 0m3.553s 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:40.214 13:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.214 ************************************ 00:11:40.214 END TEST raid_state_function_test 00:11:40.214 ************************************ 00:11:40.214 13:39:54 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:11:40.214 13:39:54 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:40.214 13:39:54 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:40.214 13:39:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:40.475 ************************************ 00:11:40.475 START TEST raid_state_function_test_sb 00:11:40.475 ************************************ 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 true 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( 
i++ )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1518277 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1518277' 00:11:40.475 Process raid pid: 1518277 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1518277 /var/tmp/spdk-raid.sock 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1518277 ']' 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:40.475 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:40.475 13:39:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:40.475 [2024-06-10 13:39:54.769941] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:11:40.475 [2024-06-10 13:39:54.769989] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:40.475 [2024-06-10 13:39:54.858968] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:40.475 [2024-06-10 13:39:54.925530] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.736 [2024-06-10 13:39:54.969202] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.736 [2024-06-10 13:39:54.969221] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.306 13:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:41.307 13:39:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:11:41.307 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:41.567 [2024-06-10 13:39:55.813171] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:41.567 [2024-06-10 13:39:55.813200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:41.567 [2024-06-10 13:39:55.813206] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:41.567 [2024-06-10 13:39:55.813213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:41.567 [2024-06-10 13:39:55.813217] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:41.567 [2024-06-10 13:39:55.813223] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:41.567 13:39:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.567 13:39:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.567 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.567 "name": "Existed_Raid", 00:11:41.567 "uuid": "463067af-1051-43ed-9454-4432be100f0f", 00:11:41.567 "strip_size_kb": 64, 00:11:41.567 "state": "configuring", 00:11:41.567 "raid_level": "raid0", 00:11:41.567 "superblock": true, 00:11:41.567 "num_base_bdevs": 3, 00:11:41.567 "num_base_bdevs_discovered": 0, 00:11:41.567 "num_base_bdevs_operational": 3, 00:11:41.567 
"base_bdevs_list": [ 00:11:41.567 { 00:11:41.567 "name": "BaseBdev1", 00:11:41.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.567 "is_configured": false, 00:11:41.568 "data_offset": 0, 00:11:41.568 "data_size": 0 00:11:41.568 }, 00:11:41.568 { 00:11:41.568 "name": "BaseBdev2", 00:11:41.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.568 "is_configured": false, 00:11:41.568 "data_offset": 0, 00:11:41.568 "data_size": 0 00:11:41.568 }, 00:11:41.568 { 00:11:41.568 "name": "BaseBdev3", 00:11:41.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.568 "is_configured": false, 00:11:41.568 "data_offset": 0, 00:11:41.568 "data_size": 0 00:11:41.568 } 00:11:41.568 ] 00:11:41.568 }' 00:11:41.568 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.568 13:39:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:42.139 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:42.398 [2024-06-10 13:39:56.763453] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:42.398 [2024-06-10 13:39:56.763470] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac4740 name Existed_Raid, state configuring 00:11:42.398 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:42.658 [2024-06-10 13:39:56.963981] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:42.658 [2024-06-10 13:39:56.963997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:42.658 [2024-06-10 13:39:56.964003] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:42.658 [2024-06-10 13:39:56.964009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:42.658 [2024-06-10 13:39:56.964014] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:42.658 [2024-06-10 13:39:56.964019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:42.658 13:39:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:42.918 [2024-06-10 13:39:57.171420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.918 BaseBdev1 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:42.918 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:43.179 [ 00:11:43.179 { 
00:11:43.179 "name": "BaseBdev1", 00:11:43.179 "aliases": [ 00:11:43.179 "ce53dc7c-1071-4302-b96e-7e589aa7df20" 00:11:43.179 ], 00:11:43.179 "product_name": "Malloc disk", 00:11:43.179 "block_size": 512, 00:11:43.179 "num_blocks": 65536, 00:11:43.179 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:43.179 "assigned_rate_limits": { 00:11:43.179 "rw_ios_per_sec": 0, 00:11:43.179 "rw_mbytes_per_sec": 0, 00:11:43.179 "r_mbytes_per_sec": 0, 00:11:43.179 "w_mbytes_per_sec": 0 00:11:43.179 }, 00:11:43.179 "claimed": true, 00:11:43.179 "claim_type": "exclusive_write", 00:11:43.179 "zoned": false, 00:11:43.179 "supported_io_types": { 00:11:43.179 "read": true, 00:11:43.179 "write": true, 00:11:43.179 "unmap": true, 00:11:43.179 "write_zeroes": true, 00:11:43.179 "flush": true, 00:11:43.179 "reset": true, 00:11:43.179 "compare": false, 00:11:43.179 "compare_and_write": false, 00:11:43.179 "abort": true, 00:11:43.179 "nvme_admin": false, 00:11:43.179 "nvme_io": false 00:11:43.179 }, 00:11:43.179 "memory_domains": [ 00:11:43.179 { 00:11:43.179 "dma_device_id": "system", 00:11:43.179 "dma_device_type": 1 00:11:43.179 }, 00:11:43.179 { 00:11:43.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.179 "dma_device_type": 2 00:11:43.179 } 00:11:43.179 ], 00:11:43.179 "driver_specific": {} 00:11:43.179 } 00:11:43.179 ] 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:43.179 13:39:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.179 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.440 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.440 "name": "Existed_Raid", 00:11:43.440 "uuid": "73fbf507-00d5-4822-a346-41b6aa9f166f", 00:11:43.440 "strip_size_kb": 64, 00:11:43.440 "state": "configuring", 00:11:43.440 "raid_level": "raid0", 00:11:43.440 "superblock": true, 00:11:43.440 "num_base_bdevs": 3, 00:11:43.440 "num_base_bdevs_discovered": 1, 00:11:43.440 "num_base_bdevs_operational": 3, 00:11:43.440 "base_bdevs_list": [ 00:11:43.440 { 00:11:43.440 "name": "BaseBdev1", 00:11:43.440 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:43.440 "is_configured": true, 00:11:43.440 "data_offset": 2048, 00:11:43.440 "data_size": 63488 00:11:43.440 }, 00:11:43.440 { 00:11:43.440 "name": "BaseBdev2", 00:11:43.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.440 "is_configured": false, 00:11:43.440 "data_offset": 0, 00:11:43.440 "data_size": 0 00:11:43.440 }, 00:11:43.440 { 00:11:43.440 "name": 
"BaseBdev3", 00:11:43.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.440 "is_configured": false, 00:11:43.440 "data_offset": 0, 00:11:43.440 "data_size": 0 00:11:43.440 } 00:11:43.440 ] 00:11:43.440 }' 00:11:43.440 13:39:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.440 13:39:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.011 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:44.271 [2024-06-10 13:39:58.526851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:44.271 [2024-06-10 13:39:58.526877] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac4010 name Existed_Raid, state configuring 00:11:44.271 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:44.272 [2024-06-10 13:39:58.731405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:44.272 [2024-06-10 13:39:58.732621] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:44.272 [2024-06-10 13:39:58.732656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:44.272 [2024-06-10 13:39:58.732662] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:44.272 [2024-06-10 13:39:58.732669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.533 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.533 "name": "Existed_Raid", 00:11:44.533 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:44.533 "strip_size_kb": 64, 00:11:44.533 "state": "configuring", 00:11:44.533 "raid_level": "raid0", 00:11:44.533 "superblock": true, 00:11:44.533 "num_base_bdevs": 3, 00:11:44.534 
"num_base_bdevs_discovered": 1, 00:11:44.534 "num_base_bdevs_operational": 3, 00:11:44.534 "base_bdevs_list": [ 00:11:44.534 { 00:11:44.534 "name": "BaseBdev1", 00:11:44.534 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:44.534 "is_configured": true, 00:11:44.534 "data_offset": 2048, 00:11:44.534 "data_size": 63488 00:11:44.534 }, 00:11:44.534 { 00:11:44.534 "name": "BaseBdev2", 00:11:44.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.534 "is_configured": false, 00:11:44.534 "data_offset": 0, 00:11:44.534 "data_size": 0 00:11:44.534 }, 00:11:44.534 { 00:11:44.534 "name": "BaseBdev3", 00:11:44.534 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.534 "is_configured": false, 00:11:44.534 "data_offset": 0, 00:11:44.534 "data_size": 0 00:11:44.534 } 00:11:44.534 ] 00:11:44.534 }' 00:11:44.534 13:39:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.534 13:39:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:45.105 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:45.365 [2024-06-10 13:39:59.626748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:45.365 BaseBdev2 00:11:45.365 13:39:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:45.365 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:45.365 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:45.365 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:45.365 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:45.365 13:39:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:45.366 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:45.627 13:39:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:45.627 [ 00:11:45.627 { 00:11:45.627 "name": "BaseBdev2", 00:11:45.627 "aliases": [ 00:11:45.627 "95e07939-15a3-49c1-a9d3-33d97caf506c" 00:11:45.627 ], 00:11:45.627 "product_name": "Malloc disk", 00:11:45.627 "block_size": 512, 00:11:45.627 "num_blocks": 65536, 00:11:45.627 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:45.627 "assigned_rate_limits": { 00:11:45.627 "rw_ios_per_sec": 0, 00:11:45.627 "rw_mbytes_per_sec": 0, 00:11:45.627 "r_mbytes_per_sec": 0, 00:11:45.627 "w_mbytes_per_sec": 0 00:11:45.627 }, 00:11:45.627 "claimed": true, 00:11:45.627 "claim_type": "exclusive_write", 00:11:45.627 "zoned": false, 00:11:45.627 "supported_io_types": { 00:11:45.627 "read": true, 00:11:45.627 "write": true, 00:11:45.627 "unmap": true, 00:11:45.627 "write_zeroes": true, 00:11:45.627 "flush": true, 00:11:45.627 "reset": true, 00:11:45.627 "compare": false, 00:11:45.627 "compare_and_write": false, 00:11:45.627 "abort": true, 00:11:45.627 "nvme_admin": false, 00:11:45.627 "nvme_io": false 00:11:45.627 }, 00:11:45.627 "memory_domains": [ 00:11:45.627 { 00:11:45.627 "dma_device_id": "system", 00:11:45.627 "dma_device_type": 1 00:11:45.627 }, 00:11:45.627 { 00:11:45.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.627 "dma_device_type": 2 00:11:45.627 } 00:11:45.627 ], 00:11:45.627 "driver_specific": {} 00:11:45.627 } 00:11:45.627 ] 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.627 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.887 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.887 "name": "Existed_Raid", 00:11:45.887 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:45.887 "strip_size_kb": 64, 00:11:45.887 
"state": "configuring", 00:11:45.887 "raid_level": "raid0", 00:11:45.887 "superblock": true, 00:11:45.887 "num_base_bdevs": 3, 00:11:45.887 "num_base_bdevs_discovered": 2, 00:11:45.887 "num_base_bdevs_operational": 3, 00:11:45.887 "base_bdevs_list": [ 00:11:45.887 { 00:11:45.887 "name": "BaseBdev1", 00:11:45.887 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:45.887 "is_configured": true, 00:11:45.887 "data_offset": 2048, 00:11:45.887 "data_size": 63488 00:11:45.887 }, 00:11:45.887 { 00:11:45.887 "name": "BaseBdev2", 00:11:45.887 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:45.887 "is_configured": true, 00:11:45.887 "data_offset": 2048, 00:11:45.887 "data_size": 63488 00:11:45.887 }, 00:11:45.887 { 00:11:45.887 "name": "BaseBdev3", 00:11:45.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.887 "is_configured": false, 00:11:45.887 "data_offset": 0, 00:11:45.887 "data_size": 0 00:11:45.887 } 00:11:45.887 ] 00:11:45.887 }' 00:11:45.887 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.887 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:46.457 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:46.716 [2024-06-10 13:40:00.971254] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:46.717 [2024-06-10 13:40:00.971371] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xac4f00 00:11:46.717 [2024-06-10 13:40:00.971380] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:46.717 [2024-06-10 13:40:00.971528] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xadbdf0 00:11:46.717 [2024-06-10 13:40:00.971623] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xac4f00 
00:11:46.717 [2024-06-10 13:40:00.971629] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xac4f00 00:11:46.717 [2024-06-10 13:40:00.971706] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:46.717 BaseBdev3 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:46.717 13:40:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.976 13:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:46.976 [ 00:11:46.977 { 00:11:46.977 "name": "BaseBdev3", 00:11:46.977 "aliases": [ 00:11:46.977 "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b" 00:11:46.977 ], 00:11:46.977 "product_name": "Malloc disk", 00:11:46.977 "block_size": 512, 00:11:46.977 "num_blocks": 65536, 00:11:46.977 "uuid": "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b", 00:11:46.977 "assigned_rate_limits": { 00:11:46.977 "rw_ios_per_sec": 0, 00:11:46.977 "rw_mbytes_per_sec": 0, 00:11:46.977 "r_mbytes_per_sec": 0, 00:11:46.977 "w_mbytes_per_sec": 0 00:11:46.977 }, 00:11:46.977 "claimed": true, 00:11:46.977 "claim_type": 
"exclusive_write", 00:11:46.977 "zoned": false, 00:11:46.977 "supported_io_types": { 00:11:46.977 "read": true, 00:11:46.977 "write": true, 00:11:46.977 "unmap": true, 00:11:46.977 "write_zeroes": true, 00:11:46.977 "flush": true, 00:11:46.977 "reset": true, 00:11:46.977 "compare": false, 00:11:46.977 "compare_and_write": false, 00:11:46.977 "abort": true, 00:11:46.977 "nvme_admin": false, 00:11:46.977 "nvme_io": false 00:11:46.977 }, 00:11:46.977 "memory_domains": [ 00:11:46.977 { 00:11:46.977 "dma_device_id": "system", 00:11:46.977 "dma_device_type": 1 00:11:46.977 }, 00:11:46.977 { 00:11:46.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.977 "dma_device_type": 2 00:11:46.977 } 00:11:46.977 ], 00:11:46.977 "driver_specific": {} 00:11:46.977 } 00:11:46.977 ] 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.977 13:40:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.977 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.238 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.238 "name": "Existed_Raid", 00:11:47.238 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:47.238 "strip_size_kb": 64, 00:11:47.238 "state": "online", 00:11:47.238 "raid_level": "raid0", 00:11:47.238 "superblock": true, 00:11:47.238 "num_base_bdevs": 3, 00:11:47.238 "num_base_bdevs_discovered": 3, 00:11:47.238 "num_base_bdevs_operational": 3, 00:11:47.238 "base_bdevs_list": [ 00:11:47.238 { 00:11:47.238 "name": "BaseBdev1", 00:11:47.238 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:47.238 "is_configured": true, 00:11:47.238 "data_offset": 2048, 00:11:47.238 "data_size": 63488 00:11:47.238 }, 00:11:47.238 { 00:11:47.238 "name": "BaseBdev2", 00:11:47.238 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:47.238 "is_configured": true, 00:11:47.238 "data_offset": 2048, 00:11:47.238 "data_size": 63488 00:11:47.238 }, 00:11:47.238 { 00:11:47.238 "name": "BaseBdev3", 00:11:47.238 "uuid": "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b", 00:11:47.238 "is_configured": true, 00:11:47.238 "data_offset": 2048, 00:11:47.238 "data_size": 63488 00:11:47.238 } 00:11:47.238 ] 00:11:47.238 }' 00:11:47.238 13:40:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.238 13:40:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.808 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:47.808 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:47.808 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:47.809 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:47.809 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:47.809 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:47.809 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:47.809 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:48.068 [2024-06-10 13:40:02.318932] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:48.068 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:48.068 "name": "Existed_Raid", 00:11:48.068 "aliases": [ 00:11:48.068 "e13d46df-9a64-4262-ba9b-bf9c32d5d623" 00:11:48.068 ], 00:11:48.068 "product_name": "Raid Volume", 00:11:48.068 "block_size": 512, 00:11:48.068 "num_blocks": 190464, 00:11:48.068 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:48.068 "assigned_rate_limits": { 00:11:48.068 "rw_ios_per_sec": 0, 00:11:48.068 "rw_mbytes_per_sec": 0, 00:11:48.068 "r_mbytes_per_sec": 0, 00:11:48.068 "w_mbytes_per_sec": 0 00:11:48.068 }, 00:11:48.068 "claimed": false, 00:11:48.068 "zoned": false, 00:11:48.068 "supported_io_types": { 00:11:48.068 "read": true, 00:11:48.068 "write": true, 00:11:48.068 "unmap": true, 
00:11:48.068 "write_zeroes": true, 00:11:48.068 "flush": true, 00:11:48.068 "reset": true, 00:11:48.068 "compare": false, 00:11:48.068 "compare_and_write": false, 00:11:48.068 "abort": false, 00:11:48.068 "nvme_admin": false, 00:11:48.068 "nvme_io": false 00:11:48.068 }, 00:11:48.068 "memory_domains": [ 00:11:48.068 { 00:11:48.068 "dma_device_id": "system", 00:11:48.068 "dma_device_type": 1 00:11:48.068 }, 00:11:48.068 { 00:11:48.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.068 "dma_device_type": 2 00:11:48.068 }, 00:11:48.068 { 00:11:48.068 "dma_device_id": "system", 00:11:48.068 "dma_device_type": 1 00:11:48.068 }, 00:11:48.068 { 00:11:48.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.068 "dma_device_type": 2 00:11:48.068 }, 00:11:48.068 { 00:11:48.068 "dma_device_id": "system", 00:11:48.068 "dma_device_type": 1 00:11:48.068 }, 00:11:48.068 { 00:11:48.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.068 "dma_device_type": 2 00:11:48.068 } 00:11:48.068 ], 00:11:48.068 "driver_specific": { 00:11:48.068 "raid": { 00:11:48.068 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:48.068 "strip_size_kb": 64, 00:11:48.068 "state": "online", 00:11:48.068 "raid_level": "raid0", 00:11:48.068 "superblock": true, 00:11:48.068 "num_base_bdevs": 3, 00:11:48.068 "num_base_bdevs_discovered": 3, 00:11:48.069 "num_base_bdevs_operational": 3, 00:11:48.069 "base_bdevs_list": [ 00:11:48.069 { 00:11:48.069 "name": "BaseBdev1", 00:11:48.069 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:48.069 "is_configured": true, 00:11:48.069 "data_offset": 2048, 00:11:48.069 "data_size": 63488 00:11:48.069 }, 00:11:48.069 { 00:11:48.069 "name": "BaseBdev2", 00:11:48.069 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:48.069 "is_configured": true, 00:11:48.069 "data_offset": 2048, 00:11:48.069 "data_size": 63488 00:11:48.069 }, 00:11:48.069 { 00:11:48.069 "name": "BaseBdev3", 00:11:48.069 "uuid": "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b", 00:11:48.069 
"is_configured": true, 00:11:48.069 "data_offset": 2048, 00:11:48.069 "data_size": 63488 00:11:48.069 } 00:11:48.069 ] 00:11:48.069 } 00:11:48.069 } 00:11:48.069 }' 00:11:48.069 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:48.069 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:48.069 BaseBdev2 00:11:48.069 BaseBdev3' 00:11:48.069 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:48.069 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:48.069 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:48.329 "name": "BaseBdev1", 00:11:48.329 "aliases": [ 00:11:48.329 "ce53dc7c-1071-4302-b96e-7e589aa7df20" 00:11:48.329 ], 00:11:48.329 "product_name": "Malloc disk", 00:11:48.329 "block_size": 512, 00:11:48.329 "num_blocks": 65536, 00:11:48.329 "uuid": "ce53dc7c-1071-4302-b96e-7e589aa7df20", 00:11:48.329 "assigned_rate_limits": { 00:11:48.329 "rw_ios_per_sec": 0, 00:11:48.329 "rw_mbytes_per_sec": 0, 00:11:48.329 "r_mbytes_per_sec": 0, 00:11:48.329 "w_mbytes_per_sec": 0 00:11:48.329 }, 00:11:48.329 "claimed": true, 00:11:48.329 "claim_type": "exclusive_write", 00:11:48.329 "zoned": false, 00:11:48.329 "supported_io_types": { 00:11:48.329 "read": true, 00:11:48.329 "write": true, 00:11:48.329 "unmap": true, 00:11:48.329 "write_zeroes": true, 00:11:48.329 "flush": true, 00:11:48.329 "reset": true, 00:11:48.329 "compare": false, 00:11:48.329 "compare_and_write": false, 00:11:48.329 "abort": true, 00:11:48.329 "nvme_admin": false, 00:11:48.329 
"nvme_io": false 00:11:48.329 }, 00:11:48.329 "memory_domains": [ 00:11:48.329 { 00:11:48.329 "dma_device_id": "system", 00:11:48.329 "dma_device_type": 1 00:11:48.329 }, 00:11:48.329 { 00:11:48.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.329 "dma_device_type": 2 00:11:48.329 } 00:11:48.329 ], 00:11:48.329 "driver_specific": {} 00:11:48.329 }' 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.329 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:48.590 13:40:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:48.590 13:40:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:48.851 "name": "BaseBdev2", 00:11:48.851 "aliases": [ 00:11:48.851 "95e07939-15a3-49c1-a9d3-33d97caf506c" 00:11:48.851 ], 00:11:48.851 "product_name": "Malloc disk", 00:11:48.851 "block_size": 512, 00:11:48.851 "num_blocks": 65536, 00:11:48.851 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:48.851 "assigned_rate_limits": { 00:11:48.851 "rw_ios_per_sec": 0, 00:11:48.851 "rw_mbytes_per_sec": 0, 00:11:48.851 "r_mbytes_per_sec": 0, 00:11:48.851 "w_mbytes_per_sec": 0 00:11:48.851 }, 00:11:48.851 "claimed": true, 00:11:48.851 "claim_type": "exclusive_write", 00:11:48.851 "zoned": false, 00:11:48.851 "supported_io_types": { 00:11:48.851 "read": true, 00:11:48.851 "write": true, 00:11:48.851 "unmap": true, 00:11:48.851 "write_zeroes": true, 00:11:48.851 "flush": true, 00:11:48.851 "reset": true, 00:11:48.851 "compare": false, 00:11:48.851 "compare_and_write": false, 00:11:48.851 "abort": true, 00:11:48.851 "nvme_admin": false, 00:11:48.851 "nvme_io": false 00:11:48.851 }, 00:11:48.851 "memory_domains": [ 00:11:48.851 { 00:11:48.851 "dma_device_id": "system", 00:11:48.851 "dma_device_type": 1 00:11:48.851 }, 00:11:48.851 { 00:11:48.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.851 "dma_device_type": 2 00:11:48.851 } 00:11:48.851 ], 00:11:48.851 "driver_specific": {} 00:11:48.851 }' 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.851 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:49.112 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:49.373 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:49.373 "name": "BaseBdev3", 00:11:49.373 "aliases": [ 00:11:49.373 "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b" 00:11:49.373 ], 00:11:49.373 "product_name": "Malloc disk", 00:11:49.373 "block_size": 512, 00:11:49.373 "num_blocks": 65536, 00:11:49.373 "uuid": "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b", 00:11:49.373 "assigned_rate_limits": { 00:11:49.373 "rw_ios_per_sec": 0, 00:11:49.373 "rw_mbytes_per_sec": 0, 00:11:49.373 "r_mbytes_per_sec": 0, 00:11:49.373 "w_mbytes_per_sec": 0 00:11:49.373 }, 00:11:49.373 "claimed": true, 00:11:49.373 "claim_type": "exclusive_write", 00:11:49.373 "zoned": false, 00:11:49.373 "supported_io_types": { 00:11:49.373 "read": true, 00:11:49.373 
"write": true, 00:11:49.373 "unmap": true, 00:11:49.373 "write_zeroes": true, 00:11:49.373 "flush": true, 00:11:49.373 "reset": true, 00:11:49.373 "compare": false, 00:11:49.373 "compare_and_write": false, 00:11:49.373 "abort": true, 00:11:49.373 "nvme_admin": false, 00:11:49.373 "nvme_io": false 00:11:49.373 }, 00:11:49.373 "memory_domains": [ 00:11:49.373 { 00:11:49.373 "dma_device_id": "system", 00:11:49.373 "dma_device_type": 1 00:11:49.373 }, 00:11:49.373 { 00:11:49.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.373 "dma_device_type": 2 00:11:49.373 } 00:11:49.373 ], 00:11:49.373 "driver_specific": {} 00:11:49.373 }' 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:49.374 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:49.635 13:40:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:49.896 [2024-06-10 13:40:04.175463] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:49.896 [2024-06-10 13:40:04.175480] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:49.896 [2024-06-10 13:40:04.175512] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.896 13:40:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.896 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.200 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.200 "name": "Existed_Raid", 00:11:50.200 "uuid": "e13d46df-9a64-4262-ba9b-bf9c32d5d623", 00:11:50.200 "strip_size_kb": 64, 00:11:50.200 "state": "offline", 00:11:50.200 "raid_level": "raid0", 00:11:50.200 "superblock": true, 00:11:50.200 "num_base_bdevs": 3, 00:11:50.200 "num_base_bdevs_discovered": 2, 00:11:50.200 "num_base_bdevs_operational": 2, 00:11:50.200 "base_bdevs_list": [ 00:11:50.200 { 00:11:50.200 "name": null, 00:11:50.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.200 "is_configured": false, 00:11:50.200 "data_offset": 2048, 00:11:50.200 "data_size": 63488 00:11:50.200 }, 00:11:50.200 { 00:11:50.200 "name": "BaseBdev2", 00:11:50.200 "uuid": "95e07939-15a3-49c1-a9d3-33d97caf506c", 00:11:50.200 "is_configured": true, 00:11:50.200 "data_offset": 2048, 00:11:50.200 "data_size": 63488 00:11:50.200 }, 00:11:50.200 { 00:11:50.200 "name": "BaseBdev3", 00:11:50.200 "uuid": "f563209c-1c2d-4ee0-9cae-7e54c0b52a6b", 00:11:50.200 "is_configured": true, 00:11:50.200 "data_offset": 2048, 00:11:50.200 "data_size": 63488 00:11:50.200 } 00:11:50.200 ] 00:11:50.200 }' 00:11:50.200 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.200 13:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.496 13:40:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:50.496 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:50.496 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:50.496 13:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.783 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:50.783 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:50.783 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:51.043 [2024-06-10 13:40:05.334435] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:51.043 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:51.043 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:51.044 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.044 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:51.305 [2024-06-10 13:40:05.745534] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:51.305 [2024-06-10 13:40:05.745565] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac4f00 name Existed_Raid, state offline 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:51.305 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:51.565 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:51.565 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:51.565 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:51.565 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:51.566 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:51.566 13:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:51.827 BaseBdev2 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:51.827 13:40:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:51.827 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:52.089 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:52.089 [ 00:11:52.089 { 00:11:52.089 "name": "BaseBdev2", 00:11:52.089 "aliases": [ 00:11:52.089 "68becb05-af9b-4c8b-ace5-0ec546436447" 00:11:52.089 ], 00:11:52.089 "product_name": "Malloc disk", 00:11:52.089 "block_size": 512, 00:11:52.089 "num_blocks": 65536, 00:11:52.089 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:52.089 "assigned_rate_limits": { 00:11:52.089 "rw_ios_per_sec": 0, 00:11:52.089 "rw_mbytes_per_sec": 0, 00:11:52.089 "r_mbytes_per_sec": 0, 00:11:52.089 "w_mbytes_per_sec": 0 00:11:52.089 }, 00:11:52.089 "claimed": false, 00:11:52.089 "zoned": false, 00:11:52.089 "supported_io_types": { 00:11:52.089 "read": true, 00:11:52.089 "write": true, 00:11:52.089 "unmap": true, 00:11:52.089 "write_zeroes": true, 00:11:52.089 "flush": true, 00:11:52.089 "reset": true, 00:11:52.089 "compare": false, 00:11:52.089 "compare_and_write": false, 00:11:52.089 "abort": true, 00:11:52.089 "nvme_admin": false, 00:11:52.089 "nvme_io": false 00:11:52.089 }, 00:11:52.089 "memory_domains": [ 00:11:52.089 { 00:11:52.089 "dma_device_id": "system", 00:11:52.089 "dma_device_type": 1 00:11:52.089 }, 00:11:52.089 { 00:11:52.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.089 "dma_device_type": 2 00:11:52.089 } 00:11:52.089 ], 
00:11:52.089 "driver_specific": {} 00:11:52.089 } 00:11:52.089 ] 00:11:52.089 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:52.089 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:52.089 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:52.089 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:52.350 BaseBdev3 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:52.350 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:52.611 13:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:52.871 [ 00:11:52.871 { 00:11:52.871 "name": "BaseBdev3", 00:11:52.871 "aliases": [ 00:11:52.871 "04708b85-0561-48df-bfcb-d0e14d66346c" 00:11:52.871 ], 00:11:52.871 "product_name": "Malloc disk", 00:11:52.871 "block_size": 512, 00:11:52.871 
"num_blocks": 65536, 00:11:52.871 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:52.871 "assigned_rate_limits": { 00:11:52.871 "rw_ios_per_sec": 0, 00:11:52.871 "rw_mbytes_per_sec": 0, 00:11:52.871 "r_mbytes_per_sec": 0, 00:11:52.871 "w_mbytes_per_sec": 0 00:11:52.871 }, 00:11:52.871 "claimed": false, 00:11:52.871 "zoned": false, 00:11:52.871 "supported_io_types": { 00:11:52.871 "read": true, 00:11:52.871 "write": true, 00:11:52.871 "unmap": true, 00:11:52.871 "write_zeroes": true, 00:11:52.871 "flush": true, 00:11:52.871 "reset": true, 00:11:52.871 "compare": false, 00:11:52.871 "compare_and_write": false, 00:11:52.871 "abort": true, 00:11:52.871 "nvme_admin": false, 00:11:52.871 "nvme_io": false 00:11:52.871 }, 00:11:52.871 "memory_domains": [ 00:11:52.871 { 00:11:52.871 "dma_device_id": "system", 00:11:52.871 "dma_device_type": 1 00:11:52.871 }, 00:11:52.871 { 00:11:52.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.871 "dma_device_type": 2 00:11:52.871 } 00:11:52.871 ], 00:11:52.871 "driver_specific": {} 00:11:52.871 } 00:11:52.871 ] 00:11:52.871 13:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:52.871 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:52.871 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:52.871 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:52.871 [2024-06-10 13:40:07.337790] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:52.871 [2024-06-10 13:40:07.337817] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:52.871 [2024-06-10 13:40:07.337829] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:52.871 [2024-06-10 13:40:07.338921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.133 "name": "Existed_Raid", 00:11:53.133 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:53.133 "strip_size_kb": 64, 00:11:53.133 "state": 
"configuring", 00:11:53.133 "raid_level": "raid0", 00:11:53.133 "superblock": true, 00:11:53.133 "num_base_bdevs": 3, 00:11:53.133 "num_base_bdevs_discovered": 2, 00:11:53.133 "num_base_bdevs_operational": 3, 00:11:53.133 "base_bdevs_list": [ 00:11:53.133 { 00:11:53.133 "name": "BaseBdev1", 00:11:53.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.133 "is_configured": false, 00:11:53.133 "data_offset": 0, 00:11:53.133 "data_size": 0 00:11:53.133 }, 00:11:53.133 { 00:11:53.133 "name": "BaseBdev2", 00:11:53.133 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:53.133 "is_configured": true, 00:11:53.133 "data_offset": 2048, 00:11:53.133 "data_size": 63488 00:11:53.133 }, 00:11:53.133 { 00:11:53.133 "name": "BaseBdev3", 00:11:53.133 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:53.133 "is_configured": true, 00:11:53.133 "data_offset": 2048, 00:11:53.133 "data_size": 63488 00:11:53.133 } 00:11:53.133 ] 00:11:53.133 }' 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.133 13:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.705 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:53.966 [2024-06-10 13:40:08.264129] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.966 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.227 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.227 "name": "Existed_Raid", 00:11:54.227 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:54.227 "strip_size_kb": 64, 00:11:54.227 "state": "configuring", 00:11:54.227 "raid_level": "raid0", 00:11:54.227 "superblock": true, 00:11:54.227 "num_base_bdevs": 3, 00:11:54.227 "num_base_bdevs_discovered": 1, 00:11:54.227 "num_base_bdevs_operational": 3, 00:11:54.227 "base_bdevs_list": [ 00:11:54.227 { 00:11:54.227 "name": "BaseBdev1", 00:11:54.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.227 "is_configured": false, 00:11:54.227 "data_offset": 0, 00:11:54.227 "data_size": 0 00:11:54.227 }, 00:11:54.227 { 00:11:54.227 "name": null, 00:11:54.227 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:54.227 "is_configured": false, 00:11:54.227 "data_offset": 2048, 00:11:54.227 "data_size": 63488 00:11:54.227 }, 00:11:54.227 { 00:11:54.227 
"name": "BaseBdev3", 00:11:54.227 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:54.227 "is_configured": true, 00:11:54.227 "data_offset": 2048, 00:11:54.227 "data_size": 63488 00:11:54.227 } 00:11:54.227 ] 00:11:54.227 }' 00:11:54.227 13:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.227 13:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.799 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:54.799 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.799 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:11:54.799 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:55.059 [2024-06-10 13:40:09.440242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:55.059 BaseBdev1 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:55.059 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:55.059 13:40:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:55.417 [ 00:11:55.417 { 00:11:55.417 "name": "BaseBdev1", 00:11:55.417 "aliases": [ 00:11:55.417 "75cfcd1b-4f26-4d07-a2f0-13890b666265" 00:11:55.417 ], 00:11:55.417 "product_name": "Malloc disk", 00:11:55.417 "block_size": 512, 00:11:55.417 "num_blocks": 65536, 00:11:55.417 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:11:55.417 "assigned_rate_limits": { 00:11:55.417 "rw_ios_per_sec": 0, 00:11:55.417 "rw_mbytes_per_sec": 0, 00:11:55.417 "r_mbytes_per_sec": 0, 00:11:55.417 "w_mbytes_per_sec": 0 00:11:55.417 }, 00:11:55.417 "claimed": true, 00:11:55.417 "claim_type": "exclusive_write", 00:11:55.417 "zoned": false, 00:11:55.417 "supported_io_types": { 00:11:55.417 "read": true, 00:11:55.417 "write": true, 00:11:55.417 "unmap": true, 00:11:55.417 "write_zeroes": true, 00:11:55.417 "flush": true, 00:11:55.417 "reset": true, 00:11:55.417 "compare": false, 00:11:55.417 "compare_and_write": false, 00:11:55.417 "abort": true, 00:11:55.417 "nvme_admin": false, 00:11:55.417 "nvme_io": false 00:11:55.417 }, 00:11:55.417 "memory_domains": [ 00:11:55.417 { 00:11:55.417 "dma_device_id": "system", 00:11:55.417 "dma_device_type": 1 00:11:55.417 }, 00:11:55.417 { 00:11:55.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.417 "dma_device_type": 2 00:11:55.417 } 00:11:55.417 ], 00:11:55.417 "driver_specific": {} 00:11:55.417 } 00:11:55.417 ] 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 3 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.417 13:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.678 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.678 "name": "Existed_Raid", 00:11:55.678 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:55.678 "strip_size_kb": 64, 00:11:55.678 "state": "configuring", 00:11:55.678 "raid_level": "raid0", 00:11:55.678 "superblock": true, 00:11:55.678 "num_base_bdevs": 3, 00:11:55.678 "num_base_bdevs_discovered": 2, 00:11:55.678 "num_base_bdevs_operational": 3, 00:11:55.678 "base_bdevs_list": [ 00:11:55.678 { 00:11:55.678 "name": "BaseBdev1", 00:11:55.679 "uuid": 
"75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:11:55.679 "is_configured": true, 00:11:55.679 "data_offset": 2048, 00:11:55.679 "data_size": 63488 00:11:55.679 }, 00:11:55.679 { 00:11:55.679 "name": null, 00:11:55.679 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:55.679 "is_configured": false, 00:11:55.679 "data_offset": 2048, 00:11:55.679 "data_size": 63488 00:11:55.679 }, 00:11:55.679 { 00:11:55.679 "name": "BaseBdev3", 00:11:55.679 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:55.679 "is_configured": true, 00:11:55.679 "data_offset": 2048, 00:11:55.679 "data_size": 63488 00:11:55.679 } 00:11:55.679 ] 00:11:55.679 }' 00:11:55.679 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.679 13:40:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.249 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.249 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:56.510 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:56.510 13:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:56.771 [2024-06-10 13:40:11.032330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:56.771 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.032 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.032 "name": "Existed_Raid", 00:11:57.032 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:57.032 "strip_size_kb": 64, 00:11:57.032 "state": "configuring", 00:11:57.032 "raid_level": "raid0", 00:11:57.032 "superblock": true, 00:11:57.032 "num_base_bdevs": 3, 00:11:57.032 "num_base_bdevs_discovered": 1, 00:11:57.032 "num_base_bdevs_operational": 3, 00:11:57.032 "base_bdevs_list": [ 00:11:57.032 { 00:11:57.032 "name": "BaseBdev1", 00:11:57.032 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:11:57.032 "is_configured": true, 00:11:57.032 "data_offset": 2048, 00:11:57.032 "data_size": 63488 00:11:57.032 }, 00:11:57.032 { 00:11:57.032 "name": null, 00:11:57.032 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 
00:11:57.032 "is_configured": false, 00:11:57.032 "data_offset": 2048, 00:11:57.032 "data_size": 63488 00:11:57.032 }, 00:11:57.032 { 00:11:57.032 "name": null, 00:11:57.032 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:57.032 "is_configured": false, 00:11:57.032 "data_offset": 2048, 00:11:57.032 "data_size": 63488 00:11:57.032 } 00:11:57.032 ] 00:11:57.032 }' 00:11:57.032 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.032 13:40:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:57.603 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.603 13:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:57.603 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:57.603 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:57.868 [2024-06-10 13:40:12.207321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.868 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.869 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.869 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.869 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.131 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.131 "name": "Existed_Raid", 00:11:58.131 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:58.131 "strip_size_kb": 64, 00:11:58.131 "state": "configuring", 00:11:58.131 "raid_level": "raid0", 00:11:58.131 "superblock": true, 00:11:58.131 "num_base_bdevs": 3, 00:11:58.131 "num_base_bdevs_discovered": 2, 00:11:58.131 "num_base_bdevs_operational": 3, 00:11:58.131 "base_bdevs_list": [ 00:11:58.131 { 00:11:58.131 "name": "BaseBdev1", 00:11:58.131 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:11:58.131 "is_configured": true, 00:11:58.131 "data_offset": 2048, 00:11:58.131 "data_size": 63488 00:11:58.131 }, 00:11:58.131 { 00:11:58.131 "name": null, 00:11:58.131 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:58.131 "is_configured": false, 00:11:58.131 "data_offset": 2048, 00:11:58.131 "data_size": 63488 00:11:58.131 }, 00:11:58.131 { 00:11:58.131 "name": "BaseBdev3", 00:11:58.131 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 
00:11:58.131 "is_configured": true, 00:11:58.131 "data_offset": 2048, 00:11:58.131 "data_size": 63488 00:11:58.131 } 00:11:58.131 ] 00:11:58.131 }' 00:11:58.131 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.131 13:40:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:58.702 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.702 13:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:58.702 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:58.702 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:58.962 [2024-06-10 13:40:13.350251] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.962 13:40:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.962 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:59.223 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.223 "name": "Existed_Raid", 00:11:59.223 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:11:59.223 "strip_size_kb": 64, 00:11:59.223 "state": "configuring", 00:11:59.223 "raid_level": "raid0", 00:11:59.223 "superblock": true, 00:11:59.223 "num_base_bdevs": 3, 00:11:59.223 "num_base_bdevs_discovered": 1, 00:11:59.223 "num_base_bdevs_operational": 3, 00:11:59.223 "base_bdevs_list": [ 00:11:59.223 { 00:11:59.223 "name": null, 00:11:59.223 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:11:59.223 "is_configured": false, 00:11:59.223 "data_offset": 2048, 00:11:59.223 "data_size": 63488 00:11:59.223 }, 00:11:59.223 { 00:11:59.223 "name": null, 00:11:59.223 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:11:59.223 "is_configured": false, 00:11:59.223 "data_offset": 2048, 00:11:59.223 "data_size": 63488 00:11:59.223 }, 00:11:59.223 { 00:11:59.223 "name": "BaseBdev3", 00:11:59.223 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:11:59.223 "is_configured": true, 00:11:59.223 "data_offset": 2048, 00:11:59.223 "data_size": 63488 00:11:59.223 } 00:11:59.223 ] 00:11:59.223 }' 00:11:59.223 13:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.223 13:40:13 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:59.793 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.793 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:00.053 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:00.053 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:00.053 [2024-06-10 13:40:14.527295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.313 13:40:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.313 "name": "Existed_Raid", 00:12:00.313 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:12:00.313 "strip_size_kb": 64, 00:12:00.313 "state": "configuring", 00:12:00.313 "raid_level": "raid0", 00:12:00.313 "superblock": true, 00:12:00.313 "num_base_bdevs": 3, 00:12:00.313 "num_base_bdevs_discovered": 2, 00:12:00.313 "num_base_bdevs_operational": 3, 00:12:00.313 "base_bdevs_list": [ 00:12:00.313 { 00:12:00.313 "name": null, 00:12:00.313 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:12:00.313 "is_configured": false, 00:12:00.313 "data_offset": 2048, 00:12:00.313 "data_size": 63488 00:12:00.313 }, 00:12:00.313 { 00:12:00.313 "name": "BaseBdev2", 00:12:00.313 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:12:00.313 "is_configured": true, 00:12:00.313 "data_offset": 2048, 00:12:00.313 "data_size": 63488 00:12:00.313 }, 00:12:00.313 { 00:12:00.313 "name": "BaseBdev3", 00:12:00.313 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:12:00.313 "is_configured": true, 00:12:00.313 "data_offset": 2048, 00:12:00.313 "data_size": 63488 00:12:00.313 } 00:12:00.313 ] 00:12:00.313 }' 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.313 13:40:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:00.883 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.883 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:01.143 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:01.143 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.143 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 75cfcd1b-4f26-4d07-a2f0-13890b666265 00:12:01.403 [2024-06-10 13:40:15.851807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:01.403 [2024-06-10 13:40:15.851916] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xac52f0 00:12:01.403 [2024-06-10 13:40:15.851924] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:01.403 [2024-06-10 13:40:15.852071] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xac4710 00:12:01.403 [2024-06-10 13:40:15.852170] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xac52f0 00:12:01.403 [2024-06-10 13:40:15.852178] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xac52f0 00:12:01.403 [2024-06-10 13:40:15.852272] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.403 NewBaseBdev 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_name=NewBaseBdev 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:01.403 13:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.663 13:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:01.924 [ 00:12:01.924 { 00:12:01.924 "name": "NewBaseBdev", 00:12:01.924 "aliases": [ 00:12:01.924 "75cfcd1b-4f26-4d07-a2f0-13890b666265" 00:12:01.924 ], 00:12:01.924 "product_name": "Malloc disk", 00:12:01.924 "block_size": 512, 00:12:01.924 "num_blocks": 65536, 00:12:01.924 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:12:01.924 "assigned_rate_limits": { 00:12:01.924 "rw_ios_per_sec": 0, 00:12:01.924 "rw_mbytes_per_sec": 0, 00:12:01.924 "r_mbytes_per_sec": 0, 00:12:01.924 "w_mbytes_per_sec": 0 00:12:01.924 }, 00:12:01.924 "claimed": true, 00:12:01.924 "claim_type": "exclusive_write", 00:12:01.924 "zoned": false, 00:12:01.924 "supported_io_types": { 00:12:01.924 "read": true, 00:12:01.924 "write": true, 00:12:01.924 "unmap": true, 00:12:01.924 "write_zeroes": true, 00:12:01.924 "flush": true, 00:12:01.924 "reset": true, 00:12:01.924 "compare": false, 00:12:01.924 "compare_and_write": false, 00:12:01.924 "abort": true, 00:12:01.924 "nvme_admin": false, 00:12:01.924 "nvme_io": false 00:12:01.924 }, 00:12:01.924 "memory_domains": [ 00:12:01.924 { 
00:12:01.924 "dma_device_id": "system", 00:12:01.924 "dma_device_type": 1 00:12:01.924 }, 00:12:01.924 { 00:12:01.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.924 "dma_device_type": 2 00:12:01.924 } 00:12:01.924 ], 00:12:01.924 "driver_specific": {} 00:12:01.924 } 00:12:01.924 ] 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.924 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.185 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:12:02.185 "name": "Existed_Raid", 00:12:02.185 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:12:02.185 "strip_size_kb": 64, 00:12:02.185 "state": "online", 00:12:02.185 "raid_level": "raid0", 00:12:02.185 "superblock": true, 00:12:02.185 "num_base_bdevs": 3, 00:12:02.185 "num_base_bdevs_discovered": 3, 00:12:02.185 "num_base_bdevs_operational": 3, 00:12:02.185 "base_bdevs_list": [ 00:12:02.185 { 00:12:02.185 "name": "NewBaseBdev", 00:12:02.185 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:12:02.185 "is_configured": true, 00:12:02.185 "data_offset": 2048, 00:12:02.185 "data_size": 63488 00:12:02.185 }, 00:12:02.185 { 00:12:02.185 "name": "BaseBdev2", 00:12:02.185 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:12:02.185 "is_configured": true, 00:12:02.185 "data_offset": 2048, 00:12:02.185 "data_size": 63488 00:12:02.185 }, 00:12:02.185 { 00:12:02.185 "name": "BaseBdev3", 00:12:02.185 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:12:02.185 "is_configured": true, 00:12:02.185 "data_offset": 2048, 00:12:02.185 "data_size": 63488 00:12:02.185 } 00:12:02.185 ] 00:12:02.185 }' 00:12:02.185 13:40:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.185 13:40:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@198 -- # local name 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:02.755 [2024-06-10 13:40:17.199479] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:02.755 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:02.755 "name": "Existed_Raid", 00:12:02.755 "aliases": [ 00:12:02.755 "82c9ae79-ef4a-4018-9492-d6d37ef3425b" 00:12:02.755 ], 00:12:02.755 "product_name": "Raid Volume", 00:12:02.755 "block_size": 512, 00:12:02.755 "num_blocks": 190464, 00:12:02.755 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:12:02.755 "assigned_rate_limits": { 00:12:02.755 "rw_ios_per_sec": 0, 00:12:02.755 "rw_mbytes_per_sec": 0, 00:12:02.755 "r_mbytes_per_sec": 0, 00:12:02.755 "w_mbytes_per_sec": 0 00:12:02.755 }, 00:12:02.755 "claimed": false, 00:12:02.755 "zoned": false, 00:12:02.755 "supported_io_types": { 00:12:02.755 "read": true, 00:12:02.755 "write": true, 00:12:02.755 "unmap": true, 00:12:02.755 "write_zeroes": true, 00:12:02.755 "flush": true, 00:12:02.755 "reset": true, 00:12:02.755 "compare": false, 00:12:02.756 "compare_and_write": false, 00:12:02.756 "abort": false, 00:12:02.756 "nvme_admin": false, 00:12:02.756 "nvme_io": false 00:12:02.756 }, 00:12:02.756 "memory_domains": [ 00:12:02.756 { 00:12:02.756 "dma_device_id": "system", 00:12:02.756 "dma_device_type": 1 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.756 "dma_device_type": 2 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "dma_device_id": "system", 00:12:02.756 "dma_device_type": 1 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.756 "dma_device_type": 2 
00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "dma_device_id": "system", 00:12:02.756 "dma_device_type": 1 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.756 "dma_device_type": 2 00:12:02.756 } 00:12:02.756 ], 00:12:02.756 "driver_specific": { 00:12:02.756 "raid": { 00:12:02.756 "uuid": "82c9ae79-ef4a-4018-9492-d6d37ef3425b", 00:12:02.756 "strip_size_kb": 64, 00:12:02.756 "state": "online", 00:12:02.756 "raid_level": "raid0", 00:12:02.756 "superblock": true, 00:12:02.756 "num_base_bdevs": 3, 00:12:02.756 "num_base_bdevs_discovered": 3, 00:12:02.756 "num_base_bdevs_operational": 3, 00:12:02.756 "base_bdevs_list": [ 00:12:02.756 { 00:12:02.756 "name": "NewBaseBdev", 00:12:02.756 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:12:02.756 "is_configured": true, 00:12:02.756 "data_offset": 2048, 00:12:02.756 "data_size": 63488 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "name": "BaseBdev2", 00:12:02.756 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:12:02.756 "is_configured": true, 00:12:02.756 "data_offset": 2048, 00:12:02.756 "data_size": 63488 00:12:02.756 }, 00:12:02.756 { 00:12:02.756 "name": "BaseBdev3", 00:12:02.756 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:12:02.756 "is_configured": true, 00:12:02.756 "data_offset": 2048, 00:12:02.756 "data_size": 63488 00:12:02.756 } 00:12:02.756 ] 00:12:02.756 } 00:12:02.756 } 00:12:02.756 }' 00:12:02.756 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:03.016 BaseBdev2 00:12:03.016 BaseBdev3' 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.016 "name": "NewBaseBdev", 00:12:03.016 "aliases": [ 00:12:03.016 "75cfcd1b-4f26-4d07-a2f0-13890b666265" 00:12:03.016 ], 00:12:03.016 "product_name": "Malloc disk", 00:12:03.016 "block_size": 512, 00:12:03.016 "num_blocks": 65536, 00:12:03.016 "uuid": "75cfcd1b-4f26-4d07-a2f0-13890b666265", 00:12:03.016 "assigned_rate_limits": { 00:12:03.016 "rw_ios_per_sec": 0, 00:12:03.016 "rw_mbytes_per_sec": 0, 00:12:03.016 "r_mbytes_per_sec": 0, 00:12:03.016 "w_mbytes_per_sec": 0 00:12:03.016 }, 00:12:03.016 "claimed": true, 00:12:03.016 "claim_type": "exclusive_write", 00:12:03.016 "zoned": false, 00:12:03.016 "supported_io_types": { 00:12:03.016 "read": true, 00:12:03.016 "write": true, 00:12:03.016 "unmap": true, 00:12:03.016 "write_zeroes": true, 00:12:03.016 "flush": true, 00:12:03.016 "reset": true, 00:12:03.016 "compare": false, 00:12:03.016 "compare_and_write": false, 00:12:03.016 "abort": true, 00:12:03.016 "nvme_admin": false, 00:12:03.016 "nvme_io": false 00:12:03.016 }, 00:12:03.016 "memory_domains": [ 00:12:03.016 { 00:12:03.016 "dma_device_id": "system", 00:12:03.016 "dma_device_type": 1 00:12:03.016 }, 00:12:03.016 { 00:12:03.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.016 "dma_device_type": 2 00:12:03.016 } 00:12:03.016 ], 00:12:03.016 "driver_specific": {} 00:12:03.016 }' 00:12:03.016 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.276 13:40:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.276 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.537 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.537 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.537 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.537 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:03.537 13:40:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.798 "name": "BaseBdev2", 00:12:03.798 "aliases": [ 00:12:03.798 "68becb05-af9b-4c8b-ace5-0ec546436447" 00:12:03.798 ], 00:12:03.798 "product_name": "Malloc disk", 00:12:03.798 "block_size": 512, 00:12:03.798 "num_blocks": 65536, 00:12:03.798 "uuid": "68becb05-af9b-4c8b-ace5-0ec546436447", 00:12:03.798 "assigned_rate_limits": { 00:12:03.798 "rw_ios_per_sec": 0, 00:12:03.798 "rw_mbytes_per_sec": 0, 00:12:03.798 "r_mbytes_per_sec": 0, 00:12:03.798 "w_mbytes_per_sec": 0 00:12:03.798 }, 00:12:03.798 "claimed": true, 
00:12:03.798 "claim_type": "exclusive_write", 00:12:03.798 "zoned": false, 00:12:03.798 "supported_io_types": { 00:12:03.798 "read": true, 00:12:03.798 "write": true, 00:12:03.798 "unmap": true, 00:12:03.798 "write_zeroes": true, 00:12:03.798 "flush": true, 00:12:03.798 "reset": true, 00:12:03.798 "compare": false, 00:12:03.798 "compare_and_write": false, 00:12:03.798 "abort": true, 00:12:03.798 "nvme_admin": false, 00:12:03.798 "nvme_io": false 00:12:03.798 }, 00:12:03.798 "memory_domains": [ 00:12:03.798 { 00:12:03.798 "dma_device_id": "system", 00:12:03.798 "dma_device_type": 1 00:12:03.798 }, 00:12:03.798 { 00:12:03.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.798 "dma_device_type": 2 00:12:03.798 } 00:12:03.798 ], 00:12:03.798 "driver_specific": {} 00:12:03.798 }' 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.798 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.059 13:40:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:04.059 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.319 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.319 "name": "BaseBdev3", 00:12:04.319 "aliases": [ 00:12:04.319 "04708b85-0561-48df-bfcb-d0e14d66346c" 00:12:04.319 ], 00:12:04.319 "product_name": "Malloc disk", 00:12:04.319 "block_size": 512, 00:12:04.319 "num_blocks": 65536, 00:12:04.319 "uuid": "04708b85-0561-48df-bfcb-d0e14d66346c", 00:12:04.319 "assigned_rate_limits": { 00:12:04.319 "rw_ios_per_sec": 0, 00:12:04.319 "rw_mbytes_per_sec": 0, 00:12:04.319 "r_mbytes_per_sec": 0, 00:12:04.319 "w_mbytes_per_sec": 0 00:12:04.319 }, 00:12:04.319 "claimed": true, 00:12:04.319 "claim_type": "exclusive_write", 00:12:04.319 "zoned": false, 00:12:04.319 "supported_io_types": { 00:12:04.319 "read": true, 00:12:04.319 "write": true, 00:12:04.319 "unmap": true, 00:12:04.319 "write_zeroes": true, 00:12:04.319 "flush": true, 00:12:04.319 "reset": true, 00:12:04.319 "compare": false, 00:12:04.319 "compare_and_write": false, 00:12:04.319 "abort": true, 00:12:04.319 "nvme_admin": false, 00:12:04.319 "nvme_io": false 00:12:04.319 }, 00:12:04.319 "memory_domains": [ 00:12:04.319 { 00:12:04.319 "dma_device_id": "system", 00:12:04.319 "dma_device_type": 1 00:12:04.319 }, 00:12:04.319 { 00:12:04.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.319 "dma_device_type": 2 00:12:04.319 } 00:12:04.319 ], 00:12:04.319 "driver_specific": {} 00:12:04.319 }' 00:12:04.319 13:40:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.319 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.320 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.320 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.320 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.320 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.320 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.580 13:40:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:04.841 [2024-06-10 13:40:19.104084] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:04.841 [2024-06-10 13:40:19.104103] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:04.841 [2024-06-10 13:40:19.104148] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.841 [2024-06-10 13:40:19.104195] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:04.841 [2024-06-10 13:40:19.104202] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xac52f0 name Existed_Raid, state offline 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1518277 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1518277 ']' 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1518277 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1518277 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1518277' 00:12:04.841 killing process with pid 1518277 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1518277 00:12:04.841 [2024-06-10 13:40:19.171659] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1518277 00:12:04.841 [2024-06-10 13:40:19.187093] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:04.841 00:12:04.841 real 0m24.603s 00:12:04.841 user 0m46.086s 00:12:04.841 sys 0m3.593s 00:12:04.841 13:40:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:04.841 13:40:19 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:04.841 ************************************ 00:12:04.841 END TEST raid_state_function_test_sb 00:12:04.841 ************************************ 00:12:05.103 13:40:19 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:12:05.103 13:40:19 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:12:05.103 13:40:19 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:05.103 13:40:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.103 ************************************ 00:12:05.103 START TEST raid_superblock_test 00:12:05.103 ************************************ 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 3 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1523698 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1523698 /var/tmp/spdk-raid.sock 00:12:05.103 13:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1523698 ']' 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:05.104 13:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.104 [2024-06-10 13:40:19.432323] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:12:05.104 [2024-06-10 13:40:19.432368] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1523698 ] 00:12:05.104 [2024-06-10 13:40:19.501220] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.104 [2024-06-10 13:40:19.565983] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.364 [2024-06-10 13:40:19.622701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.364 [2024-06-10 13:40:19.622728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:05.935 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:12:06.195 malloc1 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:06.195 [2024-06-10 13:40:20.641961] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:06.195 [2024-06-10 13:40:20.641994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.195 [2024-06-10 13:40:20.642007] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa2b550 00:12:06.195 [2024-06-10 13:40:20.642014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.195 [2024-06-10 13:40:20.643331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.195 [2024-06-10 13:40:20.643353] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:06.195 pt1 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:06.195 13:40:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:06.455 malloc2 00:12:06.455 13:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:06.715 [2024-06-10 13:40:21.045221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:06.715 [2024-06-10 13:40:21.045250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.715 [2024-06-10 13:40:21.045261] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaed0f0 00:12:06.715 [2024-06-10 13:40:21.045268] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.715 [2024-06-10 13:40:21.046512] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.715 [2024-06-10 13:40:21.046531] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:06.715 pt2 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:06.715 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:06.975 malloc3 00:12:06.975 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:06.975 [2024-06-10 13:40:21.448333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:06.975 [2024-06-10 13:40:21.448360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:06.975 [2024-06-10 13:40:21.448369] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaee5b0 00:12:06.975 [2024-06-10 13:40:21.448376] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:06.975 [2024-06-10 13:40:21.449626] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:06.975 [2024-06-10 13:40:21.449646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:07.236 pt3 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:07.236 [2024-06-10 13:40:21.636824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:07.236 [2024-06-10 13:40:21.637870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:07.236 [2024-06-10 
13:40:21.637915] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:07.236 [2024-06-10 13:40:21.638036] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa23f30 00:12:07.236 [2024-06-10 13:40:21.638044] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:07.236 [2024-06-10 13:40:21.638206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa2d500 00:12:07.236 [2024-06-10 13:40:21.638317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa23f30 00:12:07.236 [2024-06-10 13:40:21.638323] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa23f30 00:12:07.236 [2024-06-10 13:40:21.638396] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.236 13:40:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.236 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.496 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.496 "name": "raid_bdev1", 00:12:07.496 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:07.496 "strip_size_kb": 64, 00:12:07.496 "state": "online", 00:12:07.496 "raid_level": "raid0", 00:12:07.496 "superblock": true, 00:12:07.496 "num_base_bdevs": 3, 00:12:07.496 "num_base_bdevs_discovered": 3, 00:12:07.496 "num_base_bdevs_operational": 3, 00:12:07.496 "base_bdevs_list": [ 00:12:07.496 { 00:12:07.496 "name": "pt1", 00:12:07.496 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:07.496 "is_configured": true, 00:12:07.496 "data_offset": 2048, 00:12:07.496 "data_size": 63488 00:12:07.496 }, 00:12:07.496 { 00:12:07.496 "name": "pt2", 00:12:07.496 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:07.496 "is_configured": true, 00:12:07.496 "data_offset": 2048, 00:12:07.496 "data_size": 63488 00:12:07.496 }, 00:12:07.496 { 00:12:07.496 "name": "pt3", 00:12:07.496 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:07.496 "is_configured": true, 00:12:07.496 "data_offset": 2048, 00:12:07.496 "data_size": 63488 00:12:07.496 } 00:12:07.496 ] 00:12:07.496 }' 00:12:07.496 13:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.496 13:40:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:08.066 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:08.326 [2024-06-10 13:40:22.595431] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:08.326 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:08.326 "name": "raid_bdev1", 00:12:08.326 "aliases": [ 00:12:08.326 "a2a607f5-27c2-4da4-8c21-cc62b73d10ff" 00:12:08.326 ], 00:12:08.326 "product_name": "Raid Volume", 00:12:08.326 "block_size": 512, 00:12:08.326 "num_blocks": 190464, 00:12:08.326 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:08.326 "assigned_rate_limits": { 00:12:08.326 "rw_ios_per_sec": 0, 00:12:08.326 "rw_mbytes_per_sec": 0, 00:12:08.326 "r_mbytes_per_sec": 0, 00:12:08.326 "w_mbytes_per_sec": 0 00:12:08.326 }, 00:12:08.326 "claimed": false, 00:12:08.326 "zoned": false, 00:12:08.326 "supported_io_types": { 00:12:08.326 "read": true, 00:12:08.326 "write": true, 00:12:08.326 "unmap": true, 00:12:08.326 "write_zeroes": true, 00:12:08.326 "flush": true, 00:12:08.326 "reset": true, 00:12:08.326 "compare": false, 00:12:08.326 "compare_and_write": false, 00:12:08.326 "abort": false, 00:12:08.327 "nvme_admin": false, 00:12:08.327 "nvme_io": false 00:12:08.327 }, 00:12:08.327 "memory_domains": [ 00:12:08.327 { 00:12:08.327 "dma_device_id": "system", 00:12:08.327 "dma_device_type": 1 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:08.327 "dma_device_type": 2 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "dma_device_id": "system", 00:12:08.327 "dma_device_type": 1 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.327 "dma_device_type": 2 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "dma_device_id": "system", 00:12:08.327 "dma_device_type": 1 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.327 "dma_device_type": 2 00:12:08.327 } 00:12:08.327 ], 00:12:08.327 "driver_specific": { 00:12:08.327 "raid": { 00:12:08.327 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:08.327 "strip_size_kb": 64, 00:12:08.327 "state": "online", 00:12:08.327 "raid_level": "raid0", 00:12:08.327 "superblock": true, 00:12:08.327 "num_base_bdevs": 3, 00:12:08.327 "num_base_bdevs_discovered": 3, 00:12:08.327 "num_base_bdevs_operational": 3, 00:12:08.327 "base_bdevs_list": [ 00:12:08.327 { 00:12:08.327 "name": "pt1", 00:12:08.327 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.327 "is_configured": true, 00:12:08.327 "data_offset": 2048, 00:12:08.327 "data_size": 63488 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "name": "pt2", 00:12:08.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:08.327 "is_configured": true, 00:12:08.327 "data_offset": 2048, 00:12:08.327 "data_size": 63488 00:12:08.327 }, 00:12:08.327 { 00:12:08.327 "name": "pt3", 00:12:08.327 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:08.327 "is_configured": true, 00:12:08.327 "data_offset": 2048, 00:12:08.327 "data_size": 63488 00:12:08.327 } 00:12:08.327 ] 00:12:08.327 } 00:12:08.327 } 00:12:08.327 }' 00:12:08.327 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:08.327 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:08.327 pt2 00:12:08.327 pt3' 00:12:08.327 
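The `verify_raid_bdev_properties` steps that follow (`bdev_raid.sh@203`-`@208`) loop over the configured base bdevs (`pt1 pt2 pt3`, extracted by the jq filter above) and compare each bdev's `block_size`, `md_size`, `md_interleave` and `dif_type` against expected values. A dependency-free sketch of that check loop; the per-bdev `rpc.py bdev_get_bdevs -b <name> | jq ...` lookups are stubbed here with the constant values the trace reports, so this is an illustration of the loop shape, not a replay of the real RPC calls:

```shell
# Simplified sketch of the per-base-bdev checks in
# verify_raid_bdev_properties (bdev_raid.sh@203-208). The jq
# lookups against "rpc.py ... bdev_get_bdevs -b <name>" are
# stubbed with the values shown in the trace above.
base_bdev_names="pt1 pt2 pt3"
for name in $base_bdev_names; do
    block_size=512        # stub for: jq .block_size
    md_size=null          # stub for: jq .md_size
    md_interleave=null    # stub for: jq .md_interleave
    [ "$block_size" = 512 ] || exit 1
    [ "$md_size" = null ] || exit 1
    [ "$md_interleave" = null ] || exit 1
    echo "$name: properties ok"
done
```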
13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.327 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:08.327 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.587 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.587 "name": "pt1", 00:12:08.587 "aliases": [ 00:12:08.587 "00000000-0000-0000-0000-000000000001" 00:12:08.587 ], 00:12:08.588 "product_name": "passthru", 00:12:08.588 "block_size": 512, 00:12:08.588 "num_blocks": 65536, 00:12:08.588 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.588 "assigned_rate_limits": { 00:12:08.588 "rw_ios_per_sec": 0, 00:12:08.588 "rw_mbytes_per_sec": 0, 00:12:08.588 "r_mbytes_per_sec": 0, 00:12:08.588 "w_mbytes_per_sec": 0 00:12:08.588 }, 00:12:08.588 "claimed": true, 00:12:08.588 "claim_type": "exclusive_write", 00:12:08.588 "zoned": false, 00:12:08.588 "supported_io_types": { 00:12:08.588 "read": true, 00:12:08.588 "write": true, 00:12:08.588 "unmap": true, 00:12:08.588 "write_zeroes": true, 00:12:08.588 "flush": true, 00:12:08.588 "reset": true, 00:12:08.588 "compare": false, 00:12:08.588 "compare_and_write": false, 00:12:08.588 "abort": true, 00:12:08.588 "nvme_admin": false, 00:12:08.588 "nvme_io": false 00:12:08.588 }, 00:12:08.588 "memory_domains": [ 00:12:08.588 { 00:12:08.588 "dma_device_id": "system", 00:12:08.588 "dma_device_type": 1 00:12:08.588 }, 00:12:08.588 { 00:12:08.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.588 "dma_device_type": 2 00:12:08.588 } 00:12:08.588 ], 00:12:08.588 "driver_specific": { 00:12:08.588 "passthru": { 00:12:08.588 "name": "pt1", 00:12:08.588 "base_bdev_name": "malloc1" 00:12:08.588 } 00:12:08.588 } 00:12:08.588 }' 00:12:08.588 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:12:08.588 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.588 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.588 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.588 13:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.588 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.588 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.848 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.848 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.848 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.849 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.849 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.849 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.849 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:08.849 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:09.109 "name": "pt2", 00:12:09.109 "aliases": [ 00:12:09.109 "00000000-0000-0000-0000-000000000002" 00:12:09.109 ], 00:12:09.109 "product_name": "passthru", 00:12:09.109 "block_size": 512, 00:12:09.109 "num_blocks": 65536, 00:12:09.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:09.109 "assigned_rate_limits": { 00:12:09.109 "rw_ios_per_sec": 0, 00:12:09.109 
"rw_mbytes_per_sec": 0, 00:12:09.109 "r_mbytes_per_sec": 0, 00:12:09.109 "w_mbytes_per_sec": 0 00:12:09.109 }, 00:12:09.109 "claimed": true, 00:12:09.109 "claim_type": "exclusive_write", 00:12:09.109 "zoned": false, 00:12:09.109 "supported_io_types": { 00:12:09.109 "read": true, 00:12:09.109 "write": true, 00:12:09.109 "unmap": true, 00:12:09.109 "write_zeroes": true, 00:12:09.109 "flush": true, 00:12:09.109 "reset": true, 00:12:09.109 "compare": false, 00:12:09.109 "compare_and_write": false, 00:12:09.109 "abort": true, 00:12:09.109 "nvme_admin": false, 00:12:09.109 "nvme_io": false 00:12:09.109 }, 00:12:09.109 "memory_domains": [ 00:12:09.109 { 00:12:09.109 "dma_device_id": "system", 00:12:09.109 "dma_device_type": 1 00:12:09.109 }, 00:12:09.109 { 00:12:09.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.109 "dma_device_type": 2 00:12:09.109 } 00:12:09.109 ], 00:12:09.109 "driver_specific": { 00:12:09.109 "passthru": { 00:12:09.109 "name": "pt2", 00:12:09.109 "base_bdev_name": "malloc2" 00:12:09.109 } 00:12:09.109 } 00:12:09.109 }' 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.109 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:09.369 13:40:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:09.369 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:09.629 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:09.629 "name": "pt3", 00:12:09.629 "aliases": [ 00:12:09.629 "00000000-0000-0000-0000-000000000003" 00:12:09.629 ], 00:12:09.629 "product_name": "passthru", 00:12:09.629 "block_size": 512, 00:12:09.629 "num_blocks": 65536, 00:12:09.629 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:09.629 "assigned_rate_limits": { 00:12:09.629 "rw_ios_per_sec": 0, 00:12:09.629 "rw_mbytes_per_sec": 0, 00:12:09.629 "r_mbytes_per_sec": 0, 00:12:09.629 "w_mbytes_per_sec": 0 00:12:09.629 }, 00:12:09.629 "claimed": true, 00:12:09.629 "claim_type": "exclusive_write", 00:12:09.629 "zoned": false, 00:12:09.629 "supported_io_types": { 00:12:09.629 "read": true, 00:12:09.629 "write": true, 00:12:09.629 "unmap": true, 00:12:09.629 "write_zeroes": true, 00:12:09.629 "flush": true, 00:12:09.629 "reset": true, 00:12:09.629 "compare": false, 00:12:09.629 "compare_and_write": false, 00:12:09.629 "abort": true, 00:12:09.629 "nvme_admin": false, 00:12:09.629 "nvme_io": false 00:12:09.629 }, 00:12:09.629 "memory_domains": [ 00:12:09.629 { 00:12:09.629 "dma_device_id": "system", 00:12:09.629 "dma_device_type": 1 00:12:09.629 }, 00:12:09.629 { 00:12:09.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.629 "dma_device_type": 2 
00:12:09.629 } 00:12:09.629 ], 00:12:09.629 "driver_specific": { 00:12:09.629 "passthru": { 00:12:09.629 "name": "pt3", 00:12:09.629 "base_bdev_name": "malloc3" 00:12:09.629 } 00:12:09.629 } 00:12:09.629 }' 00:12:09.629 13:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.629 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.629 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:09.629 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:09.889 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:10.149 [2024-06-10 13:40:24.524336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.149 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a2a607f5-27c2-4da4-8c21-cc62b73d10ff 00:12:10.149 13:40:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a2a607f5-27c2-4da4-8c21-cc62b73d10ff ']' 00:12:10.149 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:10.410 [2024-06-10 13:40:24.728632] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:10.410 [2024-06-10 13:40:24.728644] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:10.410 [2024-06-10 13:40:24.728681] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.410 [2024-06-10 13:40:24.728720] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:10.410 [2024-06-10 13:40:24.728726] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa23f30 name raid_bdev1, state offline 00:12:10.410 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.410 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:10.670 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:10.670 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:10.670 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:10.670 13:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:10.670 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:10.670 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:10.930 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:10.930 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:11.191 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:11.191 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:11.452 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:11.452 [2024-06-10 13:40:25.915605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:11.452 [2024-06-10 13:40:25.916741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:11.452 [2024-06-10 13:40:25.916774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:11.452 [2024-06-10 13:40:25.916811] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:11.452 [2024-06-10 13:40:25.916839] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:11.452 [2024-06-10 13:40:25.916854] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:11.452 [2024-06-10 13:40:25.916865] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:11.452 [2024-06-10 13:40:25.916870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0xa2d2b0 name raid_bdev1, state configuring 00:12:11.452 request: 00:12:11.452 { 00:12:11.452 "name": "raid_bdev1", 00:12:11.452 "raid_level": "raid0", 00:12:11.452 "base_bdevs": [ 00:12:11.452 "malloc1", 00:12:11.452 "malloc2", 00:12:11.452 "malloc3" 00:12:11.452 ], 00:12:11.452 "superblock": false, 00:12:11.452 "strip_size_kb": 64, 00:12:11.452 "method": "bdev_raid_create", 00:12:11.452 "req_id": 1 00:12:11.452 } 00:12:11.452 Got JSON-RPC error response 00:12:11.452 response: 00:12:11.452 { 00:12:11.452 "code": -17, 00:12:11.452 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:11.452 } 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.712 13:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:11.712 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:11.712 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:11.712 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:11.973 [2024-06-10 13:40:26.320582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:11.973 [2024-06-10 13:40:26.320602] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:12:11.973 [2024-06-10 13:40:26.320612] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa2e7c0 00:12:11.973 [2024-06-10 13:40:26.320618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:11.973 [2024-06-10 13:40:26.321950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:11.973 [2024-06-10 13:40:26.321971] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:11.973 [2024-06-10 13:40:26.322015] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:11.973 [2024-06-10 13:40:26.322031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:11.973 pt1 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.973 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:12.234 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.234 "name": "raid_bdev1", 00:12:12.234 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:12.234 "strip_size_kb": 64, 00:12:12.234 "state": "configuring", 00:12:12.234 "raid_level": "raid0", 00:12:12.234 "superblock": true, 00:12:12.234 "num_base_bdevs": 3, 00:12:12.234 "num_base_bdevs_discovered": 1, 00:12:12.234 "num_base_bdevs_operational": 3, 00:12:12.234 "base_bdevs_list": [ 00:12:12.234 { 00:12:12.234 "name": "pt1", 00:12:12.234 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:12.234 "is_configured": true, 00:12:12.234 "data_offset": 2048, 00:12:12.234 "data_size": 63488 00:12:12.234 }, 00:12:12.234 { 00:12:12.234 "name": null, 00:12:12.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:12.234 "is_configured": false, 00:12:12.234 "data_offset": 2048, 00:12:12.234 "data_size": 63488 00:12:12.234 }, 00:12:12.234 { 00:12:12.234 "name": null, 00:12:12.234 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:12.234 "is_configured": false, 00:12:12.234 "data_offset": 2048, 00:12:12.234 "data_size": 63488 00:12:12.234 } 00:12:12.234 ] 00:12:12.234 }' 00:12:12.234 13:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.234 13:40:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.804 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:12:12.804 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:13.065 
[2024-06-10 13:40:27.283027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:13.065 [2024-06-10 13:40:27.283055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:13.065 [2024-06-10 13:40:27.283066] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaed320 00:12:13.065 [2024-06-10 13:40:27.283072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:13.065 [2024-06-10 13:40:27.283349] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:13.065 [2024-06-10 13:40:27.283363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:13.065 [2024-06-10 13:40:27.283404] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:13.065 [2024-06-10 13:40:27.283416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:13.065 pt2 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:13.065 [2024-06-10 13:40:27.487554] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:13.065 13:40:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:13.065 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.325 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.325 "name": "raid_bdev1", 00:12:13.325 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:13.325 "strip_size_kb": 64, 00:12:13.325 "state": "configuring", 00:12:13.325 "raid_level": "raid0", 00:12:13.325 "superblock": true, 00:12:13.325 "num_base_bdevs": 3, 00:12:13.325 "num_base_bdevs_discovered": 1, 00:12:13.325 "num_base_bdevs_operational": 3, 00:12:13.325 "base_bdevs_list": [ 00:12:13.325 { 00:12:13.325 "name": "pt1", 00:12:13.325 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:13.325 "is_configured": true, 00:12:13.325 "data_offset": 2048, 00:12:13.325 "data_size": 63488 00:12:13.325 }, 00:12:13.325 { 00:12:13.325 "name": null, 00:12:13.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:13.325 "is_configured": false, 00:12:13.325 "data_offset": 2048, 00:12:13.325 "data_size": 63488 00:12:13.325 }, 00:12:13.325 { 00:12:13.325 "name": null, 00:12:13.325 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:13.325 "is_configured": false, 00:12:13.325 "data_offset": 2048, 00:12:13.325 "data_size": 63488 00:12:13.325 } 00:12:13.325 ] 00:12:13.325 }' 00:12:13.325 13:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:12:13.325 13:40:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.897 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:13.897 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:13.897 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:14.158 [2024-06-10 13:40:28.417919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:14.158 [2024-06-10 13:40:28.417943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:14.158 [2024-06-10 13:40:28.417954] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa2b780 00:12:14.158 [2024-06-10 13:40:28.417961] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.158 [2024-06-10 13:40:28.418242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.158 [2024-06-10 13:40:28.418254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:14.158 [2024-06-10 13:40:28.418296] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:14.158 [2024-06-10 13:40:28.418308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:14.158 pt2 00:12:14.158 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:14.158 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:14.158 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 
00:12:14.158 [2024-06-10 13:40:28.618434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:14.158 [2024-06-10 13:40:28.618455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:14.158 [2024-06-10 13:40:28.618466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa21ea0 00:12:14.158 [2024-06-10 13:40:28.618472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:14.158 [2024-06-10 13:40:28.618723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:14.159 [2024-06-10 13:40:28.618734] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:14.159 [2024-06-10 13:40:28.618777] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:14.159 [2024-06-10 13:40:28.618789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:14.159 [2024-06-10 13:40:28.618873] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa2d660 00:12:14.159 [2024-06-10 13:40:28.618879] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:14.159 [2024-06-10 13:40:28.619021] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa27550 00:12:14.159 [2024-06-10 13:40:28.619123] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa2d660 00:12:14.159 [2024-06-10 13:40:28.619128] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa2d660 00:12:14.159 [2024-06-10 13:40:28.619216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.159 pt3 00:12:14.159 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.419 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.419 "name": "raid_bdev1", 00:12:14.419 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:14.419 "strip_size_kb": 64, 00:12:14.419 "state": "online", 00:12:14.419 "raid_level": "raid0", 00:12:14.419 "superblock": true, 00:12:14.419 "num_base_bdevs": 3, 00:12:14.419 "num_base_bdevs_discovered": 3, 00:12:14.419 "num_base_bdevs_operational": 3, 00:12:14.419 "base_bdevs_list": [ 00:12:14.419 { 00:12:14.419 "name": "pt1", 00:12:14.419 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:14.419 "is_configured": 
true, 00:12:14.419 "data_offset": 2048, 00:12:14.419 "data_size": 63488 00:12:14.419 }, 00:12:14.419 { 00:12:14.419 "name": "pt2", 00:12:14.419 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.419 "is_configured": true, 00:12:14.419 "data_offset": 2048, 00:12:14.419 "data_size": 63488 00:12:14.419 }, 00:12:14.419 { 00:12:14.420 "name": "pt3", 00:12:14.420 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:14.420 "is_configured": true, 00:12:14.420 "data_offset": 2048, 00:12:14.420 "data_size": 63488 00:12:14.420 } 00:12:14.420 ] 00:12:14.420 }' 00:12:14.420 13:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.420 13:40:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.989 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:14.989 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:14.990 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:15.249 [2024-06-10 13:40:29.561045] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:15.249 "name": "raid_bdev1", 00:12:15.249 "aliases": [ 00:12:15.249 
"a2a607f5-27c2-4da4-8c21-cc62b73d10ff" 00:12:15.249 ], 00:12:15.249 "product_name": "Raid Volume", 00:12:15.249 "block_size": 512, 00:12:15.249 "num_blocks": 190464, 00:12:15.249 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:15.249 "assigned_rate_limits": { 00:12:15.249 "rw_ios_per_sec": 0, 00:12:15.249 "rw_mbytes_per_sec": 0, 00:12:15.249 "r_mbytes_per_sec": 0, 00:12:15.249 "w_mbytes_per_sec": 0 00:12:15.249 }, 00:12:15.249 "claimed": false, 00:12:15.249 "zoned": false, 00:12:15.249 "supported_io_types": { 00:12:15.249 "read": true, 00:12:15.249 "write": true, 00:12:15.249 "unmap": true, 00:12:15.249 "write_zeroes": true, 00:12:15.249 "flush": true, 00:12:15.249 "reset": true, 00:12:15.249 "compare": false, 00:12:15.249 "compare_and_write": false, 00:12:15.249 "abort": false, 00:12:15.249 "nvme_admin": false, 00:12:15.249 "nvme_io": false 00:12:15.249 }, 00:12:15.249 "memory_domains": [ 00:12:15.249 { 00:12:15.249 "dma_device_id": "system", 00:12:15.249 "dma_device_type": 1 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.249 "dma_device_type": 2 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "dma_device_id": "system", 00:12:15.249 "dma_device_type": 1 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.249 "dma_device_type": 2 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "dma_device_id": "system", 00:12:15.249 "dma_device_type": 1 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.249 "dma_device_type": 2 00:12:15.249 } 00:12:15.249 ], 00:12:15.249 "driver_specific": { 00:12:15.249 "raid": { 00:12:15.249 "uuid": "a2a607f5-27c2-4da4-8c21-cc62b73d10ff", 00:12:15.249 "strip_size_kb": 64, 00:12:15.249 "state": "online", 00:12:15.249 "raid_level": "raid0", 00:12:15.249 "superblock": true, 00:12:15.249 "num_base_bdevs": 3, 00:12:15.249 "num_base_bdevs_discovered": 3, 00:12:15.249 "num_base_bdevs_operational": 3, 00:12:15.249 
"base_bdevs_list": [ 00:12:15.249 { 00:12:15.249 "name": "pt1", 00:12:15.249 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:15.249 "is_configured": true, 00:12:15.249 "data_offset": 2048, 00:12:15.249 "data_size": 63488 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "name": "pt2", 00:12:15.249 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:15.249 "is_configured": true, 00:12:15.249 "data_offset": 2048, 00:12:15.249 "data_size": 63488 00:12:15.249 }, 00:12:15.249 { 00:12:15.249 "name": "pt3", 00:12:15.249 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:15.249 "is_configured": true, 00:12:15.249 "data_offset": 2048, 00:12:15.249 "data_size": 63488 00:12:15.249 } 00:12:15.249 ] 00:12:15.249 } 00:12:15.249 } 00:12:15.249 }' 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:15.249 pt2 00:12:15.249 pt3' 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:15.249 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:15.509 "name": "pt1", 00:12:15.509 "aliases": [ 00:12:15.509 "00000000-0000-0000-0000-000000000001" 00:12:15.509 ], 00:12:15.509 "product_name": "passthru", 00:12:15.509 "block_size": 512, 00:12:15.509 "num_blocks": 65536, 00:12:15.509 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:15.509 "assigned_rate_limits": { 00:12:15.509 "rw_ios_per_sec": 0, 00:12:15.509 "rw_mbytes_per_sec": 0, 00:12:15.509 "r_mbytes_per_sec": 0, 
00:12:15.509 "w_mbytes_per_sec": 0 00:12:15.509 }, 00:12:15.509 "claimed": true, 00:12:15.509 "claim_type": "exclusive_write", 00:12:15.509 "zoned": false, 00:12:15.509 "supported_io_types": { 00:12:15.509 "read": true, 00:12:15.509 "write": true, 00:12:15.509 "unmap": true, 00:12:15.509 "write_zeroes": true, 00:12:15.509 "flush": true, 00:12:15.509 "reset": true, 00:12:15.509 "compare": false, 00:12:15.509 "compare_and_write": false, 00:12:15.509 "abort": true, 00:12:15.509 "nvme_admin": false, 00:12:15.509 "nvme_io": false 00:12:15.509 }, 00:12:15.509 "memory_domains": [ 00:12:15.509 { 00:12:15.509 "dma_device_id": "system", 00:12:15.509 "dma_device_type": 1 00:12:15.509 }, 00:12:15.509 { 00:12:15.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.509 "dma_device_type": 2 00:12:15.509 } 00:12:15.509 ], 00:12:15.509 "driver_specific": { 00:12:15.509 "passthru": { 00:12:15.509 "name": "pt1", 00:12:15.509 "base_bdev_name": "malloc1" 00:12:15.509 } 00:12:15.509 } 00:12:15.509 }' 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:15.509 13:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:15.769 
13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:15.769 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:16.028 "name": "pt2", 00:12:16.028 "aliases": [ 00:12:16.028 "00000000-0000-0000-0000-000000000002" 00:12:16.028 ], 00:12:16.028 "product_name": "passthru", 00:12:16.028 "block_size": 512, 00:12:16.028 "num_blocks": 65536, 00:12:16.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:16.028 "assigned_rate_limits": { 00:12:16.028 "rw_ios_per_sec": 0, 00:12:16.028 "rw_mbytes_per_sec": 0, 00:12:16.028 "r_mbytes_per_sec": 0, 00:12:16.028 "w_mbytes_per_sec": 0 00:12:16.028 }, 00:12:16.028 "claimed": true, 00:12:16.028 "claim_type": "exclusive_write", 00:12:16.028 "zoned": false, 00:12:16.028 "supported_io_types": { 00:12:16.028 "read": true, 00:12:16.028 "write": true, 00:12:16.028 "unmap": true, 00:12:16.028 "write_zeroes": true, 00:12:16.028 "flush": true, 00:12:16.028 "reset": true, 00:12:16.028 "compare": false, 00:12:16.028 "compare_and_write": false, 00:12:16.028 "abort": true, 00:12:16.028 "nvme_admin": false, 00:12:16.028 "nvme_io": false 00:12:16.028 }, 00:12:16.028 "memory_domains": [ 00:12:16.028 { 00:12:16.028 "dma_device_id": "system", 00:12:16.028 "dma_device_type": 1 00:12:16.028 }, 00:12:16.028 { 00:12:16.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.028 "dma_device_type": 2 00:12:16.028 } 00:12:16.028 ], 00:12:16.028 "driver_specific": { 00:12:16.028 "passthru": 
{ 00:12:16.028 "name": "pt2", 00:12:16.028 "base_bdev_name": "malloc2" 00:12:16.028 } 00:12:16.028 } 00:12:16.028 }' 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:16.028 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:16.287 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:16.548 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:16.548 "name": "pt3", 00:12:16.548 "aliases": [ 00:12:16.549 "00000000-0000-0000-0000-000000000003" 00:12:16.549 ], 00:12:16.549 "product_name": "passthru", 00:12:16.549 "block_size": 512, 
00:12:16.549 "num_blocks": 65536, 00:12:16.549 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:16.549 "assigned_rate_limits": { 00:12:16.549 "rw_ios_per_sec": 0, 00:12:16.549 "rw_mbytes_per_sec": 0, 00:12:16.549 "r_mbytes_per_sec": 0, 00:12:16.549 "w_mbytes_per_sec": 0 00:12:16.549 }, 00:12:16.549 "claimed": true, 00:12:16.549 "claim_type": "exclusive_write", 00:12:16.549 "zoned": false, 00:12:16.549 "supported_io_types": { 00:12:16.549 "read": true, 00:12:16.549 "write": true, 00:12:16.549 "unmap": true, 00:12:16.549 "write_zeroes": true, 00:12:16.549 "flush": true, 00:12:16.549 "reset": true, 00:12:16.549 "compare": false, 00:12:16.549 "compare_and_write": false, 00:12:16.549 "abort": true, 00:12:16.549 "nvme_admin": false, 00:12:16.549 "nvme_io": false 00:12:16.549 }, 00:12:16.549 "memory_domains": [ 00:12:16.549 { 00:12:16.549 "dma_device_id": "system", 00:12:16.549 "dma_device_type": 1 00:12:16.549 }, 00:12:16.549 { 00:12:16.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.549 "dma_device_type": 2 00:12:16.549 } 00:12:16.549 ], 00:12:16.549 "driver_specific": { 00:12:16.549 "passthru": { 00:12:16.549 "name": "pt3", 00:12:16.549 "base_bdev_name": "malloc3" 00:12:16.549 } 00:12:16.549 } 00:12:16.549 }' 00:12:16.549 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.549 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.549 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:16.549 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.549 13:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.549 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:16.549 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:16.809 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:17.069 [2024-06-10 13:40:31.389685] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a2a607f5-27c2-4da4-8c21-cc62b73d10ff '!=' a2a607f5-27c2-4da4-8c21-cc62b73d10ff ']' 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1523698 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1523698 ']' 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1523698 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1523698 00:12:17.069 
13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1523698' 00:12:17.069 killing process with pid 1523698 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1523698 00:12:17.069 [2024-06-10 13:40:31.461038] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.069 [2024-06-10 13:40:31.461084] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.069 [2024-06-10 13:40:31.461126] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.069 [2024-06-10 13:40:31.461142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa2d660 name raid_bdev1, state offline 00:12:17.069 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1523698 00:12:17.069 [2024-06-10 13:40:31.476919] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:17.329 13:40:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:17.330 00:12:17.330 real 0m12.223s 00:12:17.330 user 0m22.568s 00:12:17.330 sys 0m1.747s 00:12:17.330 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:17.330 13:40:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.330 ************************************ 00:12:17.330 END TEST raid_superblock_test 00:12:17.330 ************************************ 00:12:17.330 13:40:31 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:12:17.330 13:40:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:17.330 13:40:31 
bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:17.330 13:40:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:17.330 ************************************ 00:12:17.330 START TEST raid_read_error_test 00:12:17.330 ************************************ 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 read 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:17.330 13:40:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.QmmdkLanoj 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1526417 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1526417 /var/tmp/spdk-raid.sock 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1526417 ']' 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:17.330 13:40:31 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:17.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:17.330 13:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.330 [2024-06-10 13:40:31.761661] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:12:17.330 [2024-06-10 13:40:31.761709] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1526417 ] 00:12:17.591 [2024-06-10 13:40:31.849792] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:17.591 [2024-06-10 13:40:31.914581] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:12:17.591 [2024-06-10 13:40:31.960782] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.591 [2024-06-10 13:40:31.960807] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.163 13:40:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:18.163 13:40:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:12:18.163 13:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.163 13:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:18.423 BaseBdev1_malloc 00:12:18.423 13:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:18.683 true 00:12:18.683 13:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:18.942 [2024-06-10 13:40:33.168354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:18.942 [2024-06-10 13:40:33.168386] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:18.942 [2024-06-10 13:40:33.168398] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232bc90 00:12:18.942 [2024-06-10 13:40:33.168405] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:18.942 [2024-06-10 13:40:33.169843] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:18.942 [2024-06-10 13:40:33.169864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:18.942 BaseBdev1 00:12:18.942 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:18.942 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:18.942 BaseBdev2_malloc 00:12:18.942 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:19.202 true 00:12:19.202 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:19.462 [2024-06-10 13:40:33.759925] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:19.462 [2024-06-10 13:40:33.759959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:19.462 [2024-06-10 13:40:33.759971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2330400 00:12:19.462 [2024-06-10 13:40:33.759977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:19.462 [2024-06-10 13:40:33.761242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:19.462 [2024-06-10 13:40:33.761262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:19.462 BaseBdev2 00:12:19.462 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:19.462 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:19.721 BaseBdev3_malloc 00:12:19.721 13:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:19.721 true 00:12:19.721 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:19.981 [2024-06-10 13:40:34.363507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:19.981 [2024-06-10 13:40:34.363536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:19.981 [2024-06-10 13:40:34.363551] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2332fc0 00:12:19.981 [2024-06-10 13:40:34.363559] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:19.981 [2024-06-10 13:40:34.364815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:19.981 [2024-06-10 13:40:34.364835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:19.981 BaseBdev3 00:12:19.981 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:20.241 [2024-06-10 13:40:34.564036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:20.241 [2024-06-10 13:40:34.565104] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:20.241 [2024-06-10 13:40:34.565158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:20.241 [2024-06-10 13:40:34.565335] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2331060 00:12:20.241 [2024-06-10 13:40:34.565343] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:20.241 [2024-06-10 13:40:34.565493] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2181ea0 00:12:20.241 [2024-06-10 13:40:34.565612] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2331060 00:12:20.241 [2024-06-10 13:40:34.565617] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2331060 00:12:20.241 [2024-06-10 13:40:34.565696] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:20.241 13:40:34 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.241 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:20.501 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.501 "name": "raid_bdev1", 00:12:20.501 "uuid": "226e8130-55b1-455e-9767-0f5a81fd5356", 00:12:20.501 "strip_size_kb": 64, 00:12:20.501 "state": "online", 00:12:20.501 "raid_level": "raid0", 00:12:20.501 "superblock": true, 00:12:20.501 "num_base_bdevs": 3, 00:12:20.501 "num_base_bdevs_discovered": 3, 00:12:20.501 "num_base_bdevs_operational": 3, 00:12:20.501 "base_bdevs_list": [ 00:12:20.501 { 00:12:20.501 "name": "BaseBdev1", 00:12:20.501 "uuid": "d85f3976-e2bc-5e81-818d-a56ebb0be441", 00:12:20.501 "is_configured": true, 00:12:20.501 "data_offset": 2048, 00:12:20.501 "data_size": 63488 00:12:20.501 }, 00:12:20.501 { 00:12:20.501 "name": "BaseBdev2", 00:12:20.501 "uuid": "73118924-a3ac-5026-af8c-6c4160d434cc", 00:12:20.501 
"is_configured": true, 00:12:20.501 "data_offset": 2048, 00:12:20.501 "data_size": 63488 00:12:20.501 }, 00:12:20.501 { 00:12:20.501 "name": "BaseBdev3", 00:12:20.501 "uuid": "188f6dd7-94d6-5fca-877d-43293a45d7c7", 00:12:20.501 "is_configured": true, 00:12:20.501 "data_offset": 2048, 00:12:20.501 "data_size": 63488 00:12:20.501 } 00:12:20.501 ] 00:12:20.501 }' 00:12:20.501 13:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.501 13:40:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.070 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:21.070 13:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:21.070 [2024-06-10 13:40:35.422408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e8bde0 00:12:22.008 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.269 "name": "raid_bdev1", 00:12:22.269 "uuid": "226e8130-55b1-455e-9767-0f5a81fd5356", 00:12:22.269 "strip_size_kb": 64, 00:12:22.269 "state": "online", 00:12:22.269 "raid_level": "raid0", 00:12:22.269 "superblock": true, 00:12:22.269 "num_base_bdevs": 3, 00:12:22.269 "num_base_bdevs_discovered": 3, 00:12:22.269 "num_base_bdevs_operational": 3, 00:12:22.269 "base_bdevs_list": [ 00:12:22.269 { 00:12:22.269 "name": "BaseBdev1", 00:12:22.269 "uuid": "d85f3976-e2bc-5e81-818d-a56ebb0be441", 00:12:22.269 "is_configured": true, 00:12:22.269 "data_offset": 2048, 00:12:22.269 "data_size": 63488 00:12:22.269 }, 00:12:22.269 { 00:12:22.269 "name": "BaseBdev2", 00:12:22.269 "uuid": "73118924-a3ac-5026-af8c-6c4160d434cc", 00:12:22.269 "is_configured": true, 00:12:22.269 "data_offset": 2048, 00:12:22.269 "data_size": 63488 00:12:22.269 }, 00:12:22.269 { 00:12:22.269 "name": "BaseBdev3", 00:12:22.269 "uuid": 
"188f6dd7-94d6-5fca-877d-43293a45d7c7", 00:12:22.269 "is_configured": true, 00:12:22.269 "data_offset": 2048, 00:12:22.269 "data_size": 63488 00:12:22.269 } 00:12:22.269 ] 00:12:22.269 }' 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.269 13:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.839 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:23.099 [2024-06-10 13:40:37.470642] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:23.099 [2024-06-10 13:40:37.470669] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:23.099 [2024-06-10 13:40:37.473464] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:23.099 [2024-06-10 13:40:37.473491] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:23.099 [2024-06-10 13:40:37.473516] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:23.099 [2024-06-10 13:40:37.473522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2331060 name raid_bdev1, state offline 00:12:23.099 0 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1526417 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1526417 ']' 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1526417 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # 
ps --no-headers -o comm= 1526417 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1526417' 00:12:23.099 killing process with pid 1526417 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1526417 00:12:23.099 [2024-06-10 13:40:37.540658] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:23.099 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1526417 00:12:23.099 [2024-06-10 13:40:37.551733] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.QmmdkLanoj 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:12:23.360 00:12:23.360 real 0m6.008s 00:12:23.360 user 0m9.607s 00:12:23.360 sys 0m0.854s 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:23.360 13:40:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.360 
************************************ 00:12:23.360 END TEST raid_read_error_test 00:12:23.360 ************************************ 00:12:23.360 13:40:37 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:12:23.360 13:40:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:23.360 13:40:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:23.360 13:40:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:23.360 ************************************ 00:12:23.360 START TEST raid_write_error_test 00:12:23.360 ************************************ 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 write 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
echo BaseBdev3 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.18y490mORy 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1527715 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1527715 /var/tmp/spdk-raid.sock 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1527715 ']' 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:23.360 
13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:23.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.360 13:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:23.360 [2024-06-10 13:40:37.831622] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:12:23.360 [2024-06-10 13:40:37.831678] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1527715 ] 00:12:23.621 [2024-06-10 13:40:37.918267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:23.621 [2024-06-10 13:40:37.983297] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:12:23.621 [2024-06-10 13:40:38.025394] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.621 [2024-06-10 13:40:38.025418] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.562 13:40:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:24.562 13:40:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:12:24.562 13:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in 
"${base_bdevs[@]}" 00:12:24.562 13:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:24.562 BaseBdev1_malloc 00:12:24.562 13:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:24.822 true 00:12:24.822 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:24.822 [2024-06-10 13:40:39.269225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:24.822 [2024-06-10 13:40:39.269256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:24.822 [2024-06-10 13:40:39.269269] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195dc90 00:12:24.822 [2024-06-10 13:40:39.269275] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:24.822 [2024-06-10 13:40:39.270730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:24.822 [2024-06-10 13:40:39.270751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:24.822 BaseBdev1 00:12:24.822 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:24.822 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:25.082 BaseBdev2_malloc 00:12:25.082 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:25.341 true 00:12:25.341 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:25.602 [2024-06-10 13:40:39.872810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:25.602 [2024-06-10 13:40:39.872842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:25.602 [2024-06-10 13:40:39.872854] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1962400 00:12:25.602 [2024-06-10 13:40:39.872861] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:25.602 [2024-06-10 13:40:39.874135] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:25.602 [2024-06-10 13:40:39.874154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:25.602 BaseBdev2 00:12:25.602 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:25.602 13:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:25.862 BaseBdev3_malloc 00:12:25.862 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:25.862 true 00:12:25.862 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:26.122 [2024-06-10 13:40:40.476409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match 
on EE_BaseBdev3_malloc 00:12:26.122 [2024-06-10 13:40:40.476438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.122 [2024-06-10 13:40:40.476453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1964fc0 00:12:26.122 [2024-06-10 13:40:40.476460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.122 [2024-06-10 13:40:40.477742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.122 [2024-06-10 13:40:40.477762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:26.122 BaseBdev3 00:12:26.122 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:26.383 [2024-06-10 13:40:40.660900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.383 [2024-06-10 13:40:40.661969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:26.383 [2024-06-10 13:40:40.662023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:26.383 [2024-06-10 13:40:40.662199] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1963060 00:12:26.383 [2024-06-10 13:40:40.662207] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:26.383 [2024-06-10 13:40:40.662359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b3ea0 00:12:26.383 [2024-06-10 13:40:40.662478] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1963060 00:12:26.383 [2024-06-10 13:40:40.662483] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1963060 00:12:26.383 [2024-06-10 13:40:40.662562] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.383 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.643 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.643 "name": "raid_bdev1", 00:12:26.643 "uuid": "5125556f-fc81-4ca2-9511-beee53ada88d", 00:12:26.643 "strip_size_kb": 64, 00:12:26.643 "state": "online", 00:12:26.643 "raid_level": "raid0", 00:12:26.643 "superblock": true, 00:12:26.643 "num_base_bdevs": 3, 00:12:26.643 "num_base_bdevs_discovered": 3, 00:12:26.643 "num_base_bdevs_operational": 3, 00:12:26.643 "base_bdevs_list": [ 
00:12:26.643 { 00:12:26.643 "name": "BaseBdev1", 00:12:26.643 "uuid": "55012a74-41f0-5ebb-bad5-6be012df55fc", 00:12:26.643 "is_configured": true, 00:12:26.643 "data_offset": 2048, 00:12:26.643 "data_size": 63488 00:12:26.643 }, 00:12:26.643 { 00:12:26.643 "name": "BaseBdev2", 00:12:26.643 "uuid": "74020783-2871-597b-adfa-e8b8b0945be3", 00:12:26.643 "is_configured": true, 00:12:26.643 "data_offset": 2048, 00:12:26.643 "data_size": 63488 00:12:26.643 }, 00:12:26.643 { 00:12:26.643 "name": "BaseBdev3", 00:12:26.643 "uuid": "7748bae0-bc47-57d8-a928-d56006a33d5b", 00:12:26.643 "is_configured": true, 00:12:26.643 "data_offset": 2048, 00:12:26.643 "data_size": 63488 00:12:26.643 } 00:12:26.643 ] 00:12:26.643 }' 00:12:26.643 13:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.643 13:40:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.213 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:27.213 13:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:27.214 [2024-06-10 13:40:41.491207] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14bdde0 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # 
verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.156 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.417 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.417 "name": "raid_bdev1", 00:12:28.417 "uuid": "5125556f-fc81-4ca2-9511-beee53ada88d", 00:12:28.417 "strip_size_kb": 64, 00:12:28.417 "state": "online", 00:12:28.417 "raid_level": "raid0", 00:12:28.417 "superblock": true, 00:12:28.417 "num_base_bdevs": 3, 00:12:28.417 "num_base_bdevs_discovered": 3, 00:12:28.417 "num_base_bdevs_operational": 3, 00:12:28.417 "base_bdevs_list": [ 00:12:28.417 { 00:12:28.417 "name": "BaseBdev1", 00:12:28.417 "uuid": "55012a74-41f0-5ebb-bad5-6be012df55fc", 00:12:28.417 "is_configured": true, 
00:12:28.417 "data_offset": 2048, 00:12:28.417 "data_size": 63488 00:12:28.417 }, 00:12:28.417 { 00:12:28.417 "name": "BaseBdev2", 00:12:28.417 "uuid": "74020783-2871-597b-adfa-e8b8b0945be3", 00:12:28.417 "is_configured": true, 00:12:28.417 "data_offset": 2048, 00:12:28.417 "data_size": 63488 00:12:28.417 }, 00:12:28.417 { 00:12:28.417 "name": "BaseBdev3", 00:12:28.417 "uuid": "7748bae0-bc47-57d8-a928-d56006a33d5b", 00:12:28.417 "is_configured": true, 00:12:28.417 "data_offset": 2048, 00:12:28.417 "data_size": 63488 00:12:28.417 } 00:12:28.417 ] 00:12:28.417 }' 00:12:28.417 13:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.417 13:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.988 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:29.249 [2024-06-10 13:40:43.523098] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:29.249 [2024-06-10 13:40:43.523130] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:29.249 [2024-06-10 13:40:43.525937] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.249 [2024-06-10 13:40:43.525963] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.249 [2024-06-10 13:40:43.525987] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.249 [2024-06-10 13:40:43.525993] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1963060 name raid_bdev1, state offline 00:12:29.249 0 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1527715 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1527715 ']' 00:12:29.249 13:40:43 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1527715 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1527715 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1527715' 00:12:29.249 killing process with pid 1527715 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1527715 00:12:29.249 [2024-06-10 13:40:43.591672] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:29.249 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1527715 00:12:29.249 [2024-06-10 13:40:43.602749] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.18y490mORy 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:29.511 13:40:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:12:29.511 00:12:29.511 real 0m5.980s 00:12:29.511 user 0m9.545s 00:12:29.511 sys 0m0.848s 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:29.511 13:40:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.511 ************************************ 00:12:29.511 END TEST raid_write_error_test 00:12:29.511 ************************************ 00:12:29.511 13:40:43 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:29.511 13:40:43 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:12:29.511 13:40:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:29.511 13:40:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:29.511 13:40:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:29.511 ************************************ 00:12:29.511 START TEST raid_state_function_test 00:12:29.511 ************************************ 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 false 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false 
= true ']' 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1529060 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1529060' 00:12:29.511 Process raid pid: 1529060 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1529060 /var/tmp/spdk-raid.sock 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1529060 ']' 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:29.511 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:29.511 13:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.511 [2024-06-10 13:40:43.882911] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:12:29.511 [2024-06-10 13:40:43.882968] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:29.511 [2024-06-10 13:40:43.977000] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:29.773 [2024-06-10 13:40:44.048211] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:12:29.773 [2024-06-10 13:40:44.097371] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:29.773 [2024-06-10 13:40:44.097395] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.345 13:40:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:30.345 13:40:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:12:30.345 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:30.606 [2024-06-10 13:40:44.925305] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:30.606 [2024-06-10 13:40:44.925340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:30.606 [2024-06-10 13:40:44.925347] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:30.606 [2024-06-10 13:40:44.925354] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:30.606 [2024-06-10 13:40:44.925359] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:30.606 [2024-06-10 13:40:44.925364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:30.606 13:40:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.606 13:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.867 13:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.867 "name": "Existed_Raid", 00:12:30.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.867 "strip_size_kb": 64, 00:12:30.867 "state": "configuring", 00:12:30.867 "raid_level": "concat", 00:12:30.867 "superblock": false, 00:12:30.867 "num_base_bdevs": 3, 00:12:30.867 "num_base_bdevs_discovered": 0, 00:12:30.867 "num_base_bdevs_operational": 3, 00:12:30.867 "base_bdevs_list": [ 00:12:30.867 { 
00:12:30.867 "name": "BaseBdev1", 00:12:30.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.867 "is_configured": false, 00:12:30.867 "data_offset": 0, 00:12:30.867 "data_size": 0 00:12:30.867 }, 00:12:30.867 { 00:12:30.867 "name": "BaseBdev2", 00:12:30.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.867 "is_configured": false, 00:12:30.867 "data_offset": 0, 00:12:30.867 "data_size": 0 00:12:30.867 }, 00:12:30.867 { 00:12:30.867 "name": "BaseBdev3", 00:12:30.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:30.867 "is_configured": false, 00:12:30.867 "data_offset": 0, 00:12:30.867 "data_size": 0 00:12:30.867 } 00:12:30.867 ] 00:12:30.867 }' 00:12:30.867 13:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.867 13:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.439 13:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:31.439 [2024-06-10 13:40:45.903658] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:31.439 [2024-06-10 13:40:45.903676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b1740 name Existed_Raid, state configuring 00:12:31.700 13:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:31.700 [2024-06-10 13:40:46.100175] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:31.700 [2024-06-10 13:40:46.100192] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:31.700 [2024-06-10 13:40:46.100197] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:12:31.700 [2024-06-10 13:40:46.100203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:31.700 [2024-06-10 13:40:46.100208] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:31.700 [2024-06-10 13:40:46.100214] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:31.700 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:31.961 [2024-06-10 13:40:46.299511] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.961 BaseBdev1 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:31.961 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:32.221 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:32.221 [ 00:12:32.221 { 00:12:32.221 "name": "BaseBdev1", 00:12:32.221 "aliases": [ 00:12:32.221 
"e25a5a96-6061-4aaf-9b6e-3b78e7c01926" 00:12:32.221 ], 00:12:32.221 "product_name": "Malloc disk", 00:12:32.221 "block_size": 512, 00:12:32.221 "num_blocks": 65536, 00:12:32.221 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:32.221 "assigned_rate_limits": { 00:12:32.221 "rw_ios_per_sec": 0, 00:12:32.221 "rw_mbytes_per_sec": 0, 00:12:32.221 "r_mbytes_per_sec": 0, 00:12:32.221 "w_mbytes_per_sec": 0 00:12:32.221 }, 00:12:32.221 "claimed": true, 00:12:32.221 "claim_type": "exclusive_write", 00:12:32.221 "zoned": false, 00:12:32.221 "supported_io_types": { 00:12:32.221 "read": true, 00:12:32.221 "write": true, 00:12:32.221 "unmap": true, 00:12:32.221 "write_zeroes": true, 00:12:32.222 "flush": true, 00:12:32.222 "reset": true, 00:12:32.222 "compare": false, 00:12:32.222 "compare_and_write": false, 00:12:32.222 "abort": true, 00:12:32.222 "nvme_admin": false, 00:12:32.222 "nvme_io": false 00:12:32.222 }, 00:12:32.222 "memory_domains": [ 00:12:32.222 { 00:12:32.222 "dma_device_id": "system", 00:12:32.222 "dma_device_type": 1 00:12:32.222 }, 00:12:32.222 { 00:12:32.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.222 "dma_device_type": 2 00:12:32.222 } 00:12:32.222 ], 00:12:32.222 "driver_specific": {} 00:12:32.222 } 00:12:32.222 ] 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.483 13:40:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.483 "name": "Existed_Raid", 00:12:32.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.483 "strip_size_kb": 64, 00:12:32.483 "state": "configuring", 00:12:32.483 "raid_level": "concat", 00:12:32.483 "superblock": false, 00:12:32.483 "num_base_bdevs": 3, 00:12:32.483 "num_base_bdevs_discovered": 1, 00:12:32.483 "num_base_bdevs_operational": 3, 00:12:32.483 "base_bdevs_list": [ 00:12:32.483 { 00:12:32.483 "name": "BaseBdev1", 00:12:32.483 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:32.483 "is_configured": true, 00:12:32.483 "data_offset": 0, 00:12:32.483 "data_size": 65536 00:12:32.483 }, 00:12:32.483 { 00:12:32.483 "name": "BaseBdev2", 00:12:32.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.483 "is_configured": false, 00:12:32.483 "data_offset": 0, 00:12:32.483 "data_size": 0 00:12:32.483 }, 00:12:32.483 { 00:12:32.483 "name": "BaseBdev3", 00:12:32.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.483 "is_configured": false, 00:12:32.483 "data_offset": 0, 
00:12:32.483 "data_size": 0 00:12:32.483 } 00:12:32.483 ] 00:12:32.483 }' 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.483 13:40:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.055 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:33.315 [2024-06-10 13:40:47.646913] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:33.315 [2024-06-10 13:40:47.646938] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b1010 name Existed_Raid, state configuring 00:12:33.316 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:33.577 [2024-06-10 13:40:47.851466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:33.577 [2024-06-10 13:40:47.852672] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.577 [2024-06-10 13:40:47.852697] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.577 [2024-06-10 13:40:47.852702] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:33.577 [2024-06-10 13:40:47.852709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring concat 64 3 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.577 13:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.838 13:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.838 "name": "Existed_Raid", 00:12:33.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.838 "strip_size_kb": 64, 00:12:33.838 "state": "configuring", 00:12:33.838 "raid_level": "concat", 00:12:33.838 "superblock": false, 00:12:33.838 "num_base_bdevs": 3, 00:12:33.838 "num_base_bdevs_discovered": 1, 00:12:33.839 "num_base_bdevs_operational": 3, 00:12:33.839 "base_bdevs_list": [ 00:12:33.839 { 00:12:33.839 "name": "BaseBdev1", 00:12:33.839 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:33.839 
"is_configured": true, 00:12:33.839 "data_offset": 0, 00:12:33.839 "data_size": 65536 00:12:33.839 }, 00:12:33.839 { 00:12:33.839 "name": "BaseBdev2", 00:12:33.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.839 "is_configured": false, 00:12:33.839 "data_offset": 0, 00:12:33.839 "data_size": 0 00:12:33.839 }, 00:12:33.839 { 00:12:33.839 "name": "BaseBdev3", 00:12:33.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.839 "is_configured": false, 00:12:33.839 "data_offset": 0, 00:12:33.839 "data_size": 0 00:12:33.839 } 00:12:33.839 ] 00:12:33.839 }' 00:12:33.839 13:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.839 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:34.409 [2024-06-10 13:40:48.839063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:34.409 BaseBdev2 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:34.409 13:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:12:34.670 13:40:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:34.931 [ 00:12:34.931 { 00:12:34.931 "name": "BaseBdev2", 00:12:34.931 "aliases": [ 00:12:34.931 "ef2e1887-1c44-41ee-b651-f0b305db77e8" 00:12:34.931 ], 00:12:34.931 "product_name": "Malloc disk", 00:12:34.931 "block_size": 512, 00:12:34.931 "num_blocks": 65536, 00:12:34.931 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:34.931 "assigned_rate_limits": { 00:12:34.931 "rw_ios_per_sec": 0, 00:12:34.931 "rw_mbytes_per_sec": 0, 00:12:34.931 "r_mbytes_per_sec": 0, 00:12:34.931 "w_mbytes_per_sec": 0 00:12:34.931 }, 00:12:34.931 "claimed": true, 00:12:34.931 "claim_type": "exclusive_write", 00:12:34.931 "zoned": false, 00:12:34.931 "supported_io_types": { 00:12:34.931 "read": true, 00:12:34.931 "write": true, 00:12:34.931 "unmap": true, 00:12:34.931 "write_zeroes": true, 00:12:34.931 "flush": true, 00:12:34.931 "reset": true, 00:12:34.931 "compare": false, 00:12:34.931 "compare_and_write": false, 00:12:34.931 "abort": true, 00:12:34.931 "nvme_admin": false, 00:12:34.931 "nvme_io": false 00:12:34.931 }, 00:12:34.931 "memory_domains": [ 00:12:34.931 { 00:12:34.931 "dma_device_id": "system", 00:12:34.931 "dma_device_type": 1 00:12:34.931 }, 00:12:34.931 { 00:12:34.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.931 "dma_device_type": 2 00:12:34.931 } 00:12:34.931 ], 00:12:34.931 "driver_specific": {} 00:12:34.931 } 00:12:34.931 ] 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.931 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.193 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.193 "name": "Existed_Raid", 00:12:35.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.193 "strip_size_kb": 64, 00:12:35.193 "state": "configuring", 00:12:35.193 "raid_level": "concat", 00:12:35.193 "superblock": false, 00:12:35.193 "num_base_bdevs": 3, 00:12:35.193 "num_base_bdevs_discovered": 2, 00:12:35.193 "num_base_bdevs_operational": 3, 00:12:35.193 "base_bdevs_list": [ 00:12:35.193 { 00:12:35.193 "name": "BaseBdev1", 00:12:35.193 "uuid": 
"e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:35.193 "is_configured": true, 00:12:35.193 "data_offset": 0, 00:12:35.193 "data_size": 65536 00:12:35.193 }, 00:12:35.193 { 00:12:35.193 "name": "BaseBdev2", 00:12:35.193 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:35.193 "is_configured": true, 00:12:35.193 "data_offset": 0, 00:12:35.193 "data_size": 65536 00:12:35.193 }, 00:12:35.193 { 00:12:35.193 "name": "BaseBdev3", 00:12:35.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.193 "is_configured": false, 00:12:35.193 "data_offset": 0, 00:12:35.193 "data_size": 0 00:12:35.193 } 00:12:35.193 ] 00:12:35.193 }' 00:12:35.193 13:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.193 13:40:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:35.765 [2024-06-10 13:40:50.203804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:35.765 [2024-06-10 13:40:50.203834] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24b1f00 00:12:35.765 [2024-06-10 13:40:50.203839] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:35.765 [2024-06-10 13:40:50.203997] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c8df0 00:12:35.765 [2024-06-10 13:40:50.204096] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24b1f00 00:12:35.765 [2024-06-10 13:40:50.204102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24b1f00 00:12:35.765 [2024-06-10 13:40:50.204245] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:35.765 BaseBdev3 00:12:35.765 13:40:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:35.765 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:36.026 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:36.288 [ 00:12:36.288 { 00:12:36.288 "name": "BaseBdev3", 00:12:36.288 "aliases": [ 00:12:36.288 "8a5e7254-d83b-496b-ab5d-305cf1116903" 00:12:36.288 ], 00:12:36.288 "product_name": "Malloc disk", 00:12:36.288 "block_size": 512, 00:12:36.288 "num_blocks": 65536, 00:12:36.288 "uuid": "8a5e7254-d83b-496b-ab5d-305cf1116903", 00:12:36.288 "assigned_rate_limits": { 00:12:36.288 "rw_ios_per_sec": 0, 00:12:36.288 "rw_mbytes_per_sec": 0, 00:12:36.288 "r_mbytes_per_sec": 0, 00:12:36.288 "w_mbytes_per_sec": 0 00:12:36.288 }, 00:12:36.288 "claimed": true, 00:12:36.288 "claim_type": "exclusive_write", 00:12:36.288 "zoned": false, 00:12:36.288 "supported_io_types": { 00:12:36.288 "read": true, 00:12:36.288 "write": true, 00:12:36.288 "unmap": true, 00:12:36.288 "write_zeroes": true, 00:12:36.288 "flush": true, 00:12:36.288 "reset": true, 00:12:36.288 "compare": false, 00:12:36.288 "compare_and_write": false, 
00:12:36.288 "abort": true, 00:12:36.288 "nvme_admin": false, 00:12:36.288 "nvme_io": false 00:12:36.288 }, 00:12:36.288 "memory_domains": [ 00:12:36.288 { 00:12:36.288 "dma_device_id": "system", 00:12:36.288 "dma_device_type": 1 00:12:36.288 }, 00:12:36.288 { 00:12:36.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.288 "dma_device_type": 2 00:12:36.288 } 00:12:36.288 ], 00:12:36.288 "driver_specific": {} 00:12:36.288 } 00:12:36.288 ] 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.288 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.550 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.550 "name": "Existed_Raid", 00:12:36.550 "uuid": "edca3402-2439-4870-893f-2ed5977f0208", 00:12:36.550 "strip_size_kb": 64, 00:12:36.550 "state": "online", 00:12:36.550 "raid_level": "concat", 00:12:36.550 "superblock": false, 00:12:36.550 "num_base_bdevs": 3, 00:12:36.550 "num_base_bdevs_discovered": 3, 00:12:36.550 "num_base_bdevs_operational": 3, 00:12:36.550 "base_bdevs_list": [ 00:12:36.550 { 00:12:36.550 "name": "BaseBdev1", 00:12:36.550 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:36.550 "is_configured": true, 00:12:36.550 "data_offset": 0, 00:12:36.550 "data_size": 65536 00:12:36.550 }, 00:12:36.550 { 00:12:36.550 "name": "BaseBdev2", 00:12:36.550 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:36.550 "is_configured": true, 00:12:36.550 "data_offset": 0, 00:12:36.550 "data_size": 65536 00:12:36.550 }, 00:12:36.550 { 00:12:36.550 "name": "BaseBdev3", 00:12:36.550 "uuid": "8a5e7254-d83b-496b-ab5d-305cf1116903", 00:12:36.550 "is_configured": true, 00:12:36.550 "data_offset": 0, 00:12:36.550 "data_size": 65536 00:12:36.550 } 00:12:36.550 ] 00:12:36.550 }' 00:12:36.550 13:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.550 13:40:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:37.122 [2024-06-10 13:40:51.559475] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:37.122 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:37.122 "name": "Existed_Raid", 00:12:37.122 "aliases": [ 00:12:37.122 "edca3402-2439-4870-893f-2ed5977f0208" 00:12:37.122 ], 00:12:37.122 "product_name": "Raid Volume", 00:12:37.122 "block_size": 512, 00:12:37.122 "num_blocks": 196608, 00:12:37.122 "uuid": "edca3402-2439-4870-893f-2ed5977f0208", 00:12:37.122 "assigned_rate_limits": { 00:12:37.122 "rw_ios_per_sec": 0, 00:12:37.122 "rw_mbytes_per_sec": 0, 00:12:37.122 "r_mbytes_per_sec": 0, 00:12:37.123 "w_mbytes_per_sec": 0 00:12:37.123 }, 00:12:37.123 "claimed": false, 00:12:37.123 "zoned": false, 00:12:37.123 "supported_io_types": { 00:12:37.123 "read": true, 00:12:37.123 "write": true, 00:12:37.123 "unmap": true, 00:12:37.123 "write_zeroes": true, 00:12:37.123 "flush": true, 00:12:37.123 "reset": true, 00:12:37.123 "compare": false, 00:12:37.123 "compare_and_write": false, 00:12:37.123 "abort": false, 00:12:37.123 "nvme_admin": false, 00:12:37.123 "nvme_io": false 00:12:37.123 }, 00:12:37.123 "memory_domains": [ 00:12:37.123 { 00:12:37.123 "dma_device_id": "system", 00:12:37.123 "dma_device_type": 1 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:37.123 "dma_device_type": 2 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "dma_device_id": "system", 00:12:37.123 "dma_device_type": 1 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.123 "dma_device_type": 2 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "dma_device_id": "system", 00:12:37.123 "dma_device_type": 1 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.123 "dma_device_type": 2 00:12:37.123 } 00:12:37.123 ], 00:12:37.123 "driver_specific": { 00:12:37.123 "raid": { 00:12:37.123 "uuid": "edca3402-2439-4870-893f-2ed5977f0208", 00:12:37.123 "strip_size_kb": 64, 00:12:37.123 "state": "online", 00:12:37.123 "raid_level": "concat", 00:12:37.123 "superblock": false, 00:12:37.123 "num_base_bdevs": 3, 00:12:37.123 "num_base_bdevs_discovered": 3, 00:12:37.123 "num_base_bdevs_operational": 3, 00:12:37.123 "base_bdevs_list": [ 00:12:37.123 { 00:12:37.123 "name": "BaseBdev1", 00:12:37.123 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:37.123 "is_configured": true, 00:12:37.123 "data_offset": 0, 00:12:37.123 "data_size": 65536 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "name": "BaseBdev2", 00:12:37.123 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:37.123 "is_configured": true, 00:12:37.123 "data_offset": 0, 00:12:37.123 "data_size": 65536 00:12:37.123 }, 00:12:37.123 { 00:12:37.123 "name": "BaseBdev3", 00:12:37.123 "uuid": "8a5e7254-d83b-496b-ab5d-305cf1116903", 00:12:37.123 "is_configured": true, 00:12:37.123 "data_offset": 0, 00:12:37.123 "data_size": 65536 00:12:37.123 } 00:12:37.123 ] 00:12:37.123 } 00:12:37.123 } 00:12:37.123 }' 00:12:37.123 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:37.404 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:37.404 BaseBdev2 
00:12:37.404 BaseBdev3' 00:12:37.404 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.404 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:37.404 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.404 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.404 "name": "BaseBdev1", 00:12:37.404 "aliases": [ 00:12:37.404 "e25a5a96-6061-4aaf-9b6e-3b78e7c01926" 00:12:37.404 ], 00:12:37.404 "product_name": "Malloc disk", 00:12:37.404 "block_size": 512, 00:12:37.404 "num_blocks": 65536, 00:12:37.404 "uuid": "e25a5a96-6061-4aaf-9b6e-3b78e7c01926", 00:12:37.404 "assigned_rate_limits": { 00:12:37.404 "rw_ios_per_sec": 0, 00:12:37.404 "rw_mbytes_per_sec": 0, 00:12:37.404 "r_mbytes_per_sec": 0, 00:12:37.404 "w_mbytes_per_sec": 0 00:12:37.404 }, 00:12:37.404 "claimed": true, 00:12:37.404 "claim_type": "exclusive_write", 00:12:37.404 "zoned": false, 00:12:37.404 "supported_io_types": { 00:12:37.404 "read": true, 00:12:37.404 "write": true, 00:12:37.404 "unmap": true, 00:12:37.404 "write_zeroes": true, 00:12:37.404 "flush": true, 00:12:37.405 "reset": true, 00:12:37.405 "compare": false, 00:12:37.405 "compare_and_write": false, 00:12:37.405 "abort": true, 00:12:37.405 "nvme_admin": false, 00:12:37.405 "nvme_io": false 00:12:37.405 }, 00:12:37.405 "memory_domains": [ 00:12:37.405 { 00:12:37.405 "dma_device_id": "system", 00:12:37.405 "dma_device_type": 1 00:12:37.405 }, 00:12:37.405 { 00:12:37.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.405 "dma_device_type": 2 00:12:37.405 } 00:12:37.405 ], 00:12:37.405 "driver_specific": {} 00:12:37.405 }' 00:12:37.405 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.405 13:40:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.697 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.697 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.697 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.697 13:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:37.697 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.697 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.697 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:37.697 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.697 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.988 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:37.988 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.988 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.988 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:37.988 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.988 "name": "BaseBdev2", 00:12:37.988 "aliases": [ 00:12:37.988 "ef2e1887-1c44-41ee-b651-f0b305db77e8" 00:12:37.988 ], 00:12:37.988 "product_name": "Malloc disk", 00:12:37.988 "block_size": 512, 00:12:37.989 "num_blocks": 65536, 00:12:37.989 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:37.989 "assigned_rate_limits": { 00:12:37.989 
"rw_ios_per_sec": 0, 00:12:37.989 "rw_mbytes_per_sec": 0, 00:12:37.989 "r_mbytes_per_sec": 0, 00:12:37.989 "w_mbytes_per_sec": 0 00:12:37.989 }, 00:12:37.989 "claimed": true, 00:12:37.989 "claim_type": "exclusive_write", 00:12:37.989 "zoned": false, 00:12:37.989 "supported_io_types": { 00:12:37.989 "read": true, 00:12:37.989 "write": true, 00:12:37.989 "unmap": true, 00:12:37.989 "write_zeroes": true, 00:12:37.989 "flush": true, 00:12:37.989 "reset": true, 00:12:37.989 "compare": false, 00:12:37.989 "compare_and_write": false, 00:12:37.989 "abort": true, 00:12:37.989 "nvme_admin": false, 00:12:37.989 "nvme_io": false 00:12:37.989 }, 00:12:37.989 "memory_domains": [ 00:12:37.989 { 00:12:37.989 "dma_device_id": "system", 00:12:37.989 "dma_device_type": 1 00:12:37.989 }, 00:12:37.989 { 00:12:37.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.989 "dma_device_type": 2 00:12:37.989 } 00:12:37.989 ], 00:12:37.989 "driver_specific": {} 00:12:37.989 }' 00:12:37.989 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.989 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.989 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.989 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.249 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.249 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.249 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.249 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.249 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:38.250 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.511 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.511 "name": "BaseBdev3", 00:12:38.511 "aliases": [ 00:12:38.511 "8a5e7254-d83b-496b-ab5d-305cf1116903" 00:12:38.511 ], 00:12:38.511 "product_name": "Malloc disk", 00:12:38.511 "block_size": 512, 00:12:38.511 "num_blocks": 65536, 00:12:38.511 "uuid": "8a5e7254-d83b-496b-ab5d-305cf1116903", 00:12:38.511 "assigned_rate_limits": { 00:12:38.511 "rw_ios_per_sec": 0, 00:12:38.511 "rw_mbytes_per_sec": 0, 00:12:38.511 "r_mbytes_per_sec": 0, 00:12:38.511 "w_mbytes_per_sec": 0 00:12:38.511 }, 00:12:38.511 "claimed": true, 00:12:38.511 "claim_type": "exclusive_write", 00:12:38.511 "zoned": false, 00:12:38.511 "supported_io_types": { 00:12:38.511 "read": true, 00:12:38.511 "write": true, 00:12:38.511 "unmap": true, 00:12:38.511 "write_zeroes": true, 00:12:38.511 "flush": true, 00:12:38.511 "reset": true, 00:12:38.511 "compare": false, 00:12:38.511 "compare_and_write": false, 00:12:38.511 "abort": true, 00:12:38.511 "nvme_admin": false, 00:12:38.511 "nvme_io": false 00:12:38.511 }, 00:12:38.511 "memory_domains": [ 00:12:38.511 { 00:12:38.511 "dma_device_id": "system", 00:12:38.511 "dma_device_type": 1 00:12:38.511 }, 00:12:38.511 { 00:12:38.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.511 "dma_device_type": 2 00:12:38.511 } 00:12:38.511 ], 
00:12:38.511 "driver_specific": {} 00:12:38.511 }' 00:12:38.511 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.511 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.771 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.771 13:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.771 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.772 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:39.033 [2024-06-10 13:40:53.412023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:39.033 [2024-06-10 13:40:53.412041] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.033 [2024-06-10 13:40:53.412075] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:39.033 13:40:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.033 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.296 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.296 
"name": "Existed_Raid", 00:12:39.296 "uuid": "edca3402-2439-4870-893f-2ed5977f0208", 00:12:39.296 "strip_size_kb": 64, 00:12:39.296 "state": "offline", 00:12:39.296 "raid_level": "concat", 00:12:39.296 "superblock": false, 00:12:39.296 "num_base_bdevs": 3, 00:12:39.296 "num_base_bdevs_discovered": 2, 00:12:39.296 "num_base_bdevs_operational": 2, 00:12:39.296 "base_bdevs_list": [ 00:12:39.296 { 00:12:39.296 "name": null, 00:12:39.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.296 "is_configured": false, 00:12:39.296 "data_offset": 0, 00:12:39.296 "data_size": 65536 00:12:39.296 }, 00:12:39.296 { 00:12:39.296 "name": "BaseBdev2", 00:12:39.296 "uuid": "ef2e1887-1c44-41ee-b651-f0b305db77e8", 00:12:39.296 "is_configured": true, 00:12:39.296 "data_offset": 0, 00:12:39.296 "data_size": 65536 00:12:39.296 }, 00:12:39.296 { 00:12:39.296 "name": "BaseBdev3", 00:12:39.296 "uuid": "8a5e7254-d83b-496b-ab5d-305cf1116903", 00:12:39.296 "is_configured": true, 00:12:39.296 "data_offset": 0, 00:12:39.296 "data_size": 65536 00:12:39.296 } 00:12:39.296 ] 00:12:39.296 }' 00:12:39.296 13:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.296 13:40:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.868 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:39.868 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:39.868 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.868 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:40.129 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:40.129 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:40.129 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:40.129 [2024-06-10 13:40:54.591029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:40.389 13:40:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:40.650 [2024-06-10 13:40:54.986127] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:40.650 [2024-06-10 13:40:54.986159] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b1f00 name Existed_Raid, state offline 00:12:40.650 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:40.650 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:40.650 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.650 
13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:40.911 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:41.172 BaseBdev2 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:41.172 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:41.433 [ 00:12:41.433 { 00:12:41.433 
"name": "BaseBdev2", 00:12:41.433 "aliases": [ 00:12:41.433 "54c5c148-6021-4ce9-9ed9-97ca2eeb7106" 00:12:41.433 ], 00:12:41.433 "product_name": "Malloc disk", 00:12:41.433 "block_size": 512, 00:12:41.433 "num_blocks": 65536, 00:12:41.433 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:41.433 "assigned_rate_limits": { 00:12:41.433 "rw_ios_per_sec": 0, 00:12:41.433 "rw_mbytes_per_sec": 0, 00:12:41.433 "r_mbytes_per_sec": 0, 00:12:41.433 "w_mbytes_per_sec": 0 00:12:41.433 }, 00:12:41.433 "claimed": false, 00:12:41.433 "zoned": false, 00:12:41.433 "supported_io_types": { 00:12:41.433 "read": true, 00:12:41.433 "write": true, 00:12:41.433 "unmap": true, 00:12:41.433 "write_zeroes": true, 00:12:41.433 "flush": true, 00:12:41.433 "reset": true, 00:12:41.433 "compare": false, 00:12:41.433 "compare_and_write": false, 00:12:41.433 "abort": true, 00:12:41.433 "nvme_admin": false, 00:12:41.433 "nvme_io": false 00:12:41.433 }, 00:12:41.433 "memory_domains": [ 00:12:41.433 { 00:12:41.433 "dma_device_id": "system", 00:12:41.433 "dma_device_type": 1 00:12:41.433 }, 00:12:41.433 { 00:12:41.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.433 "dma_device_type": 2 00:12:41.433 } 00:12:41.433 ], 00:12:41.433 "driver_specific": {} 00:12:41.433 } 00:12:41.433 ] 00:12:41.433 13:40:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:41.433 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:41.433 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:41.433 13:40:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:41.694 BaseBdev3 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:41.694 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:41.955 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:41.955 [ 00:12:41.955 { 00:12:41.955 "name": "BaseBdev3", 00:12:41.955 "aliases": [ 00:12:41.955 "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0" 00:12:41.955 ], 00:12:41.955 "product_name": "Malloc disk", 00:12:41.955 "block_size": 512, 00:12:41.955 "num_blocks": 65536, 00:12:41.955 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:41.955 "assigned_rate_limits": { 00:12:41.955 "rw_ios_per_sec": 0, 00:12:41.955 "rw_mbytes_per_sec": 0, 00:12:41.955 "r_mbytes_per_sec": 0, 00:12:41.955 "w_mbytes_per_sec": 0 00:12:41.955 }, 00:12:41.955 "claimed": false, 00:12:41.955 "zoned": false, 00:12:41.955 "supported_io_types": { 00:12:41.955 "read": true, 00:12:41.955 "write": true, 00:12:41.955 "unmap": true, 00:12:41.955 "write_zeroes": true, 00:12:41.955 "flush": true, 00:12:41.955 "reset": true, 00:12:41.955 "compare": false, 00:12:41.955 "compare_and_write": false, 00:12:41.955 "abort": true, 00:12:41.955 "nvme_admin": false, 00:12:41.955 "nvme_io": false 00:12:41.955 }, 00:12:41.955 "memory_domains": [ 00:12:41.955 { 00:12:41.955 "dma_device_id": "system", 
00:12:41.955 "dma_device_type": 1 00:12:41.955 }, 00:12:41.955 { 00:12:41.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.955 "dma_device_type": 2 00:12:41.955 } 00:12:41.955 ], 00:12:41.955 "driver_specific": {} 00:12:41.955 } 00:12:41.955 ] 00:12:41.955 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:41.955 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:41.955 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:41.955 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:42.216 [2024-06-10 13:40:56.574407] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:42.216 [2024-06-10 13:40:56.574439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:42.216 [2024-06-10 13:40:56.574453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:42.216 [2024-06-10 13:40:56.575543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.216 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.477 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.477 "name": "Existed_Raid", 00:12:42.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.477 "strip_size_kb": 64, 00:12:42.477 "state": "configuring", 00:12:42.477 "raid_level": "concat", 00:12:42.477 "superblock": false, 00:12:42.477 "num_base_bdevs": 3, 00:12:42.477 "num_base_bdevs_discovered": 2, 00:12:42.477 "num_base_bdevs_operational": 3, 00:12:42.477 "base_bdevs_list": [ 00:12:42.477 { 00:12:42.477 "name": "BaseBdev1", 00:12:42.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.477 "is_configured": false, 00:12:42.477 "data_offset": 0, 00:12:42.477 "data_size": 0 00:12:42.477 }, 00:12:42.477 { 00:12:42.477 "name": "BaseBdev2", 00:12:42.477 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:42.477 "is_configured": true, 00:12:42.477 "data_offset": 0, 00:12:42.477 "data_size": 65536 00:12:42.477 }, 00:12:42.477 { 00:12:42.477 "name": "BaseBdev3", 00:12:42.477 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:42.477 "is_configured": true, 00:12:42.477 "data_offset": 0, 00:12:42.477 "data_size": 65536 
00:12:42.477 } 00:12:42.477 ] 00:12:42.477 }' 00:12:42.477 13:40:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.477 13:40:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.048 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:43.048 [2024-06-10 13:40:57.508688] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.309 "name": "Existed_Raid", 00:12:43.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.309 "strip_size_kb": 64, 00:12:43.309 "state": "configuring", 00:12:43.309 "raid_level": "concat", 00:12:43.309 "superblock": false, 00:12:43.309 "num_base_bdevs": 3, 00:12:43.309 "num_base_bdevs_discovered": 1, 00:12:43.309 "num_base_bdevs_operational": 3, 00:12:43.309 "base_bdevs_list": [ 00:12:43.309 { 00:12:43.309 "name": "BaseBdev1", 00:12:43.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.309 "is_configured": false, 00:12:43.309 "data_offset": 0, 00:12:43.309 "data_size": 0 00:12:43.309 }, 00:12:43.309 { 00:12:43.309 "name": null, 00:12:43.309 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:43.309 "is_configured": false, 00:12:43.309 "data_offset": 0, 00:12:43.309 "data_size": 65536 00:12:43.309 }, 00:12:43.309 { 00:12:43.309 "name": "BaseBdev3", 00:12:43.309 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:43.309 "is_configured": true, 00:12:43.309 "data_offset": 0, 00:12:43.309 "data_size": 65536 00:12:43.309 } 00:12:43.309 ] 00:12:43.309 }' 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.309 13:40:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.881 13:40:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.881 13:40:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:44.142 13:40:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:44.142 13:40:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:44.403 [2024-06-10 13:40:58.688264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:44.403 BaseBdev1 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:44.403 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:44.665 13:40:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:44.665 [ 00:12:44.665 { 00:12:44.665 "name": "BaseBdev1", 00:12:44.665 "aliases": [ 00:12:44.665 "5b08a532-105e-4c9c-9782-09de2e0bd7b5" 00:12:44.665 ], 00:12:44.665 "product_name": "Malloc disk", 00:12:44.665 "block_size": 512, 00:12:44.665 "num_blocks": 65536, 00:12:44.665 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:44.665 "assigned_rate_limits": { 00:12:44.665 "rw_ios_per_sec": 0, 00:12:44.665 "rw_mbytes_per_sec": 0, 00:12:44.665 "r_mbytes_per_sec": 0, 00:12:44.665 "w_mbytes_per_sec": 0 00:12:44.665 }, 00:12:44.665 "claimed": true, 00:12:44.665 "claim_type": 
"exclusive_write", 00:12:44.665 "zoned": false, 00:12:44.665 "supported_io_types": { 00:12:44.665 "read": true, 00:12:44.665 "write": true, 00:12:44.665 "unmap": true, 00:12:44.665 "write_zeroes": true, 00:12:44.665 "flush": true, 00:12:44.665 "reset": true, 00:12:44.665 "compare": false, 00:12:44.665 "compare_and_write": false, 00:12:44.665 "abort": true, 00:12:44.665 "nvme_admin": false, 00:12:44.665 "nvme_io": false 00:12:44.665 }, 00:12:44.665 "memory_domains": [ 00:12:44.665 { 00:12:44.665 "dma_device_id": "system", 00:12:44.665 "dma_device_type": 1 00:12:44.665 }, 00:12:44.665 { 00:12:44.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.665 "dma_device_type": 2 00:12:44.665 } 00:12:44.665 ], 00:12:44.665 "driver_specific": {} 00:12:44.665 } 00:12:44.665 ] 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.665 13:40:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.665 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.926 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.926 "name": "Existed_Raid", 00:12:44.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.926 "strip_size_kb": 64, 00:12:44.926 "state": "configuring", 00:12:44.926 "raid_level": "concat", 00:12:44.926 "superblock": false, 00:12:44.926 "num_base_bdevs": 3, 00:12:44.926 "num_base_bdevs_discovered": 2, 00:12:44.926 "num_base_bdevs_operational": 3, 00:12:44.926 "base_bdevs_list": [ 00:12:44.926 { 00:12:44.926 "name": "BaseBdev1", 00:12:44.926 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:44.926 "is_configured": true, 00:12:44.926 "data_offset": 0, 00:12:44.926 "data_size": 65536 00:12:44.926 }, 00:12:44.926 { 00:12:44.926 "name": null, 00:12:44.926 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:44.926 "is_configured": false, 00:12:44.926 "data_offset": 0, 00:12:44.926 "data_size": 65536 00:12:44.926 }, 00:12:44.926 { 00:12:44.926 "name": "BaseBdev3", 00:12:44.926 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:44.926 "is_configured": true, 00:12:44.926 "data_offset": 0, 00:12:44.926 "data_size": 65536 00:12:44.926 } 00:12:44.926 ] 00:12:44.926 }' 00:12:44.926 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.926 13:40:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.498 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:12:45.498 13:40:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:45.758 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:45.758 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:46.019 [2024-06-10 13:41:00.236116] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.019 13:41:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.019 "name": "Existed_Raid", 00:12:46.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.019 "strip_size_kb": 64, 00:12:46.019 "state": "configuring", 00:12:46.019 "raid_level": "concat", 00:12:46.019 "superblock": false, 00:12:46.019 "num_base_bdevs": 3, 00:12:46.019 "num_base_bdevs_discovered": 1, 00:12:46.019 "num_base_bdevs_operational": 3, 00:12:46.019 "base_bdevs_list": [ 00:12:46.019 { 00:12:46.019 "name": "BaseBdev1", 00:12:46.019 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:46.019 "is_configured": true, 00:12:46.019 "data_offset": 0, 00:12:46.019 "data_size": 65536 00:12:46.019 }, 00:12:46.019 { 00:12:46.019 "name": null, 00:12:46.019 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:46.019 "is_configured": false, 00:12:46.019 "data_offset": 0, 00:12:46.019 "data_size": 65536 00:12:46.019 }, 00:12:46.019 { 00:12:46.019 "name": null, 00:12:46.019 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:46.019 "is_configured": false, 00:12:46.019 "data_offset": 0, 00:12:46.019 "data_size": 65536 00:12:46.019 } 00:12:46.019 ] 00:12:46.019 }' 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.019 13:41:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.590 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.590 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:46.851 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:46.851 13:41:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:47.111 [2024-06-10 13:41:01.443109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.111 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.112 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.112 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.112 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.112 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.373 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.373 "name": "Existed_Raid", 00:12:47.373 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:47.373 "strip_size_kb": 64, 00:12:47.373 "state": "configuring", 00:12:47.373 "raid_level": "concat", 00:12:47.373 "superblock": false, 00:12:47.373 "num_base_bdevs": 3, 00:12:47.373 "num_base_bdevs_discovered": 2, 00:12:47.373 "num_base_bdevs_operational": 3, 00:12:47.373 "base_bdevs_list": [ 00:12:47.373 { 00:12:47.373 "name": "BaseBdev1", 00:12:47.373 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:47.373 "is_configured": true, 00:12:47.373 "data_offset": 0, 00:12:47.373 "data_size": 65536 00:12:47.373 }, 00:12:47.373 { 00:12:47.373 "name": null, 00:12:47.373 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:47.373 "is_configured": false, 00:12:47.373 "data_offset": 0, 00:12:47.373 "data_size": 65536 00:12:47.373 }, 00:12:47.373 { 00:12:47.373 "name": "BaseBdev3", 00:12:47.373 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:47.373 "is_configured": true, 00:12:47.373 "data_offset": 0, 00:12:47.373 "data_size": 65536 00:12:47.373 } 00:12:47.373 ] 00:12:47.373 }' 00:12:47.373 13:41:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.373 13:41:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.944 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.944 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:48.205 [2024-06-10 13:41:02.650105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 
00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.205 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.466 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.466 "name": "Existed_Raid", 00:12:48.466 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.466 "strip_size_kb": 64, 00:12:48.466 "state": "configuring", 00:12:48.466 "raid_level": "concat", 00:12:48.466 "superblock": false, 00:12:48.466 "num_base_bdevs": 3, 00:12:48.466 "num_base_bdevs_discovered": 1, 00:12:48.466 "num_base_bdevs_operational": 3, 00:12:48.466 "base_bdevs_list": [ 
00:12:48.466 { 00:12:48.466 "name": null, 00:12:48.466 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:48.466 "is_configured": false, 00:12:48.466 "data_offset": 0, 00:12:48.466 "data_size": 65536 00:12:48.466 }, 00:12:48.466 { 00:12:48.466 "name": null, 00:12:48.466 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:48.466 "is_configured": false, 00:12:48.466 "data_offset": 0, 00:12:48.466 "data_size": 65536 00:12:48.466 }, 00:12:48.466 { 00:12:48.466 "name": "BaseBdev3", 00:12:48.466 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:48.466 "is_configured": true, 00:12:48.466 "data_offset": 0, 00:12:48.466 "data_size": 65536 00:12:48.466 } 00:12:48.466 ] 00:12:48.466 }' 00:12:48.466 13:41:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.466 13:41:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.037 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.037 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:49.298 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:49.298 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:49.558 [2024-06-10 13:41:03.870499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:49.558 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:49.558 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.558 13:41:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.559 13:41:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.820 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.820 "name": "Existed_Raid", 00:12:49.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.820 "strip_size_kb": 64, 00:12:49.820 "state": "configuring", 00:12:49.820 "raid_level": "concat", 00:12:49.820 "superblock": false, 00:12:49.820 "num_base_bdevs": 3, 00:12:49.820 "num_base_bdevs_discovered": 2, 00:12:49.820 "num_base_bdevs_operational": 3, 00:12:49.820 "base_bdevs_list": [ 00:12:49.820 { 00:12:49.820 "name": null, 00:12:49.820 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:49.820 "is_configured": false, 00:12:49.820 "data_offset": 0, 00:12:49.820 "data_size": 65536 00:12:49.820 }, 00:12:49.820 { 00:12:49.820 "name": "BaseBdev2", 00:12:49.820 "uuid": 
"54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:49.820 "is_configured": true, 00:12:49.820 "data_offset": 0, 00:12:49.820 "data_size": 65536 00:12:49.820 }, 00:12:49.820 { 00:12:49.820 "name": "BaseBdev3", 00:12:49.820 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:49.820 "is_configured": true, 00:12:49.820 "data_offset": 0, 00:12:49.820 "data_size": 65536 00:12:49.820 } 00:12:49.820 ] 00:12:49.820 }' 00:12:49.820 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.820 13:41:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.391 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.392 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:50.652 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:50.652 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.652 13:41:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:50.652 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5b08a532-105e-4c9c-9782-09de2e0bd7b5 00:12:50.912 [2024-06-10 13:41:05.278493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:50.912 [2024-06-10 13:41:05.278513] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24b3090 00:12:50.912 [2024-06-10 13:41:05.278516] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, 
blocklen 512 00:12:50.912 [2024-06-10 13:41:05.278630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24b1710 00:12:50.912 [2024-06-10 13:41:05.278697] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24b3090 00:12:50.912 [2024-06-10 13:41:05.278702] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24b3090 00:12:50.912 [2024-06-10 13:41:05.278795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:50.912 NewBaseBdev 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:50.912 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.173 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:51.433 [ 00:12:51.433 { 00:12:51.433 "name": "NewBaseBdev", 00:12:51.433 "aliases": [ 00:12:51.433 "5b08a532-105e-4c9c-9782-09de2e0bd7b5" 00:12:51.433 ], 00:12:51.433 "product_name": "Malloc disk", 00:12:51.433 "block_size": 512, 00:12:51.433 "num_blocks": 65536, 00:12:51.433 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:51.433 
"assigned_rate_limits": { 00:12:51.433 "rw_ios_per_sec": 0, 00:12:51.433 "rw_mbytes_per_sec": 0, 00:12:51.433 "r_mbytes_per_sec": 0, 00:12:51.433 "w_mbytes_per_sec": 0 00:12:51.433 }, 00:12:51.433 "claimed": true, 00:12:51.433 "claim_type": "exclusive_write", 00:12:51.433 "zoned": false, 00:12:51.433 "supported_io_types": { 00:12:51.433 "read": true, 00:12:51.433 "write": true, 00:12:51.433 "unmap": true, 00:12:51.433 "write_zeroes": true, 00:12:51.433 "flush": true, 00:12:51.433 "reset": true, 00:12:51.433 "compare": false, 00:12:51.433 "compare_and_write": false, 00:12:51.433 "abort": true, 00:12:51.433 "nvme_admin": false, 00:12:51.433 "nvme_io": false 00:12:51.433 }, 00:12:51.433 "memory_domains": [ 00:12:51.433 { 00:12:51.433 "dma_device_id": "system", 00:12:51.433 "dma_device_type": 1 00:12:51.433 }, 00:12:51.433 { 00:12:51.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.433 "dma_device_type": 2 00:12:51.433 } 00:12:51.433 ], 00:12:51.433 "driver_specific": {} 00:12:51.433 } 00:12:51.433 ] 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.433 13:41:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.433 "name": "Existed_Raid", 00:12:51.433 "uuid": "42f5f174-ca1e-4b89-92cf-70abfe25d7e1", 00:12:51.433 "strip_size_kb": 64, 00:12:51.433 "state": "online", 00:12:51.433 "raid_level": "concat", 00:12:51.433 "superblock": false, 00:12:51.433 "num_base_bdevs": 3, 00:12:51.433 "num_base_bdevs_discovered": 3, 00:12:51.433 "num_base_bdevs_operational": 3, 00:12:51.433 "base_bdevs_list": [ 00:12:51.433 { 00:12:51.433 "name": "NewBaseBdev", 00:12:51.433 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:51.433 "is_configured": true, 00:12:51.433 "data_offset": 0, 00:12:51.433 "data_size": 65536 00:12:51.433 }, 00:12:51.433 { 00:12:51.433 "name": "BaseBdev2", 00:12:51.433 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:51.433 "is_configured": true, 00:12:51.433 "data_offset": 0, 00:12:51.433 "data_size": 65536 00:12:51.433 }, 00:12:51.433 { 00:12:51.433 "name": "BaseBdev3", 00:12:51.433 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:51.433 "is_configured": true, 00:12:51.433 "data_offset": 0, 00:12:51.433 "data_size": 65536 00:12:51.433 } 00:12:51.433 ] 00:12:51.433 }' 00:12:51.433 13:41:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.433 13:41:05 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:52.004 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:52.264 [2024-06-10 13:41:06.601932] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:52.264 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:52.264 "name": "Existed_Raid", 00:12:52.264 "aliases": [ 00:12:52.264 "42f5f174-ca1e-4b89-92cf-70abfe25d7e1" 00:12:52.264 ], 00:12:52.264 "product_name": "Raid Volume", 00:12:52.264 "block_size": 512, 00:12:52.264 "num_blocks": 196608, 00:12:52.264 "uuid": "42f5f174-ca1e-4b89-92cf-70abfe25d7e1", 00:12:52.264 "assigned_rate_limits": { 00:12:52.264 "rw_ios_per_sec": 0, 00:12:52.264 "rw_mbytes_per_sec": 0, 00:12:52.264 "r_mbytes_per_sec": 0, 00:12:52.264 "w_mbytes_per_sec": 0 00:12:52.264 }, 00:12:52.264 "claimed": false, 00:12:52.264 "zoned": false, 00:12:52.264 "supported_io_types": { 00:12:52.264 "read": true, 00:12:52.264 "write": true, 00:12:52.264 "unmap": true, 00:12:52.264 "write_zeroes": true, 
00:12:52.264 "flush": true, 00:12:52.264 "reset": true, 00:12:52.264 "compare": false, 00:12:52.264 "compare_and_write": false, 00:12:52.264 "abort": false, 00:12:52.264 "nvme_admin": false, 00:12:52.264 "nvme_io": false 00:12:52.264 }, 00:12:52.264 "memory_domains": [ 00:12:52.264 { 00:12:52.264 "dma_device_id": "system", 00:12:52.264 "dma_device_type": 1 00:12:52.264 }, 00:12:52.264 { 00:12:52.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.264 "dma_device_type": 2 00:12:52.264 }, 00:12:52.264 { 00:12:52.264 "dma_device_id": "system", 00:12:52.264 "dma_device_type": 1 00:12:52.264 }, 00:12:52.264 { 00:12:52.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.264 "dma_device_type": 2 00:12:52.264 }, 00:12:52.264 { 00:12:52.264 "dma_device_id": "system", 00:12:52.264 "dma_device_type": 1 00:12:52.264 }, 00:12:52.264 { 00:12:52.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.264 "dma_device_type": 2 00:12:52.264 } 00:12:52.264 ], 00:12:52.265 "driver_specific": { 00:12:52.265 "raid": { 00:12:52.265 "uuid": "42f5f174-ca1e-4b89-92cf-70abfe25d7e1", 00:12:52.265 "strip_size_kb": 64, 00:12:52.265 "state": "online", 00:12:52.265 "raid_level": "concat", 00:12:52.265 "superblock": false, 00:12:52.265 "num_base_bdevs": 3, 00:12:52.265 "num_base_bdevs_discovered": 3, 00:12:52.265 "num_base_bdevs_operational": 3, 00:12:52.265 "base_bdevs_list": [ 00:12:52.265 { 00:12:52.265 "name": "NewBaseBdev", 00:12:52.265 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:52.265 "is_configured": true, 00:12:52.265 "data_offset": 0, 00:12:52.265 "data_size": 65536 00:12:52.265 }, 00:12:52.265 { 00:12:52.265 "name": "BaseBdev2", 00:12:52.265 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:52.265 "is_configured": true, 00:12:52.265 "data_offset": 0, 00:12:52.265 "data_size": 65536 00:12:52.265 }, 00:12:52.265 { 00:12:52.265 "name": "BaseBdev3", 00:12:52.265 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:52.265 "is_configured": true, 00:12:52.265 "data_offset": 
0, 00:12:52.265 "data_size": 65536 00:12:52.265 } 00:12:52.265 ] 00:12:52.265 } 00:12:52.265 } 00:12:52.265 }' 00:12:52.265 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:52.265 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:52.265 BaseBdev2 00:12:52.265 BaseBdev3' 00:12:52.265 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.265 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:52.265 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:52.525 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:52.525 "name": "NewBaseBdev", 00:12:52.525 "aliases": [ 00:12:52.525 "5b08a532-105e-4c9c-9782-09de2e0bd7b5" 00:12:52.525 ], 00:12:52.525 "product_name": "Malloc disk", 00:12:52.525 "block_size": 512, 00:12:52.525 "num_blocks": 65536, 00:12:52.525 "uuid": "5b08a532-105e-4c9c-9782-09de2e0bd7b5", 00:12:52.525 "assigned_rate_limits": { 00:12:52.525 "rw_ios_per_sec": 0, 00:12:52.525 "rw_mbytes_per_sec": 0, 00:12:52.525 "r_mbytes_per_sec": 0, 00:12:52.525 "w_mbytes_per_sec": 0 00:12:52.525 }, 00:12:52.525 "claimed": true, 00:12:52.525 "claim_type": "exclusive_write", 00:12:52.525 "zoned": false, 00:12:52.525 "supported_io_types": { 00:12:52.525 "read": true, 00:12:52.525 "write": true, 00:12:52.525 "unmap": true, 00:12:52.525 "write_zeroes": true, 00:12:52.525 "flush": true, 00:12:52.525 "reset": true, 00:12:52.525 "compare": false, 00:12:52.525 "compare_and_write": false, 00:12:52.525 "abort": true, 00:12:52.525 "nvme_admin": false, 00:12:52.525 "nvme_io": false 00:12:52.525 }, 00:12:52.525 "memory_domains": [ 
00:12:52.525 { 00:12:52.525 "dma_device_id": "system", 00:12:52.525 "dma_device_type": 1 00:12:52.525 }, 00:12:52.525 { 00:12:52.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.525 "dma_device_type": 2 00:12:52.525 } 00:12:52.525 ], 00:12:52.525 "driver_specific": {} 00:12:52.525 }' 00:12:52.525 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.525 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.525 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.525 13:41:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.785 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.785 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.785 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:52.786 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.046 13:41:07 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.046 "name": "BaseBdev2", 00:12:53.046 "aliases": [ 00:12:53.046 "54c5c148-6021-4ce9-9ed9-97ca2eeb7106" 00:12:53.046 ], 00:12:53.046 "product_name": "Malloc disk", 00:12:53.046 "block_size": 512, 00:12:53.046 "num_blocks": 65536, 00:12:53.046 "uuid": "54c5c148-6021-4ce9-9ed9-97ca2eeb7106", 00:12:53.046 "assigned_rate_limits": { 00:12:53.046 "rw_ios_per_sec": 0, 00:12:53.046 "rw_mbytes_per_sec": 0, 00:12:53.046 "r_mbytes_per_sec": 0, 00:12:53.046 "w_mbytes_per_sec": 0 00:12:53.046 }, 00:12:53.046 "claimed": true, 00:12:53.046 "claim_type": "exclusive_write", 00:12:53.046 "zoned": false, 00:12:53.046 "supported_io_types": { 00:12:53.046 "read": true, 00:12:53.046 "write": true, 00:12:53.046 "unmap": true, 00:12:53.046 "write_zeroes": true, 00:12:53.046 "flush": true, 00:12:53.046 "reset": true, 00:12:53.046 "compare": false, 00:12:53.046 "compare_and_write": false, 00:12:53.046 "abort": true, 00:12:53.046 "nvme_admin": false, 00:12:53.046 "nvme_io": false 00:12:53.046 }, 00:12:53.046 "memory_domains": [ 00:12:53.046 { 00:12:53.046 "dma_device_id": "system", 00:12:53.046 "dma_device_type": 1 00:12:53.046 }, 00:12:53.046 { 00:12:53.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.046 "dma_device_type": 2 00:12:53.046 } 00:12:53.046 ], 00:12:53.046 "driver_specific": {} 00:12:53.046 }' 00:12:53.046 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.046 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.306 13:41:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.306 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.566 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:53.566 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.566 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.566 "name": "BaseBdev3", 00:12:53.566 "aliases": [ 00:12:53.566 "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0" 00:12:53.566 ], 00:12:53.566 "product_name": "Malloc disk", 00:12:53.566 "block_size": 512, 00:12:53.566 "num_blocks": 65536, 00:12:53.566 "uuid": "ce0d81b1-2b4c-4668-abe1-997d1dfa0dd0", 00:12:53.566 "assigned_rate_limits": { 00:12:53.566 "rw_ios_per_sec": 0, 00:12:53.566 "rw_mbytes_per_sec": 0, 00:12:53.566 "r_mbytes_per_sec": 0, 00:12:53.566 "w_mbytes_per_sec": 0 00:12:53.566 }, 00:12:53.566 "claimed": true, 00:12:53.566 "claim_type": "exclusive_write", 00:12:53.566 "zoned": false, 00:12:53.566 "supported_io_types": { 00:12:53.566 "read": true, 00:12:53.566 "write": true, 00:12:53.566 "unmap": true, 00:12:53.566 "write_zeroes": true, 00:12:53.566 "flush": true, 00:12:53.566 "reset": true, 00:12:53.566 "compare": false, 00:12:53.566 "compare_and_write": 
false, 00:12:53.566 "abort": true, 00:12:53.566 "nvme_admin": false, 00:12:53.566 "nvme_io": false 00:12:53.566 }, 00:12:53.566 "memory_domains": [ 00:12:53.566 { 00:12:53.566 "dma_device_id": "system", 00:12:53.566 "dma_device_type": 1 00:12:53.566 }, 00:12:53.566 { 00:12:53.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.566 "dma_device_type": 2 00:12:53.566 } 00:12:53.566 ], 00:12:53.566 "driver_specific": {} 00:12:53.566 }' 00:12:53.566 13:41:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.566 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.826 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:54.086 [2024-06-10 13:41:08.518525] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:12:54.086 [2024-06-10 13:41:08.518539] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.086 [2024-06-10 13:41:08.518574] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.086 [2024-06-10 13:41:08.518607] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.086 [2024-06-10 13:41:08.518612] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b3090 name Existed_Raid, state offline 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1529060 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1529060 ']' 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1529060 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:12:54.086 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:54.087 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1529060 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1529060' 00:12:54.347 killing process with pid 1529060 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1529060 00:12:54.347 [2024-06-10 13:41:08.587963] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1529060 00:12:54.347 
[2024-06-10 13:41:08.601598] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:54.347 00:12:54.347 real 0m24.885s 00:12:54.347 user 0m46.703s 00:12:54.347 sys 0m3.593s 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:54.347 13:41:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.347 ************************************ 00:12:54.347 END TEST raid_state_function_test 00:12:54.347 ************************************ 00:12:54.347 13:41:08 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:12:54.348 13:41:08 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:54.348 13:41:08 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:54.348 13:41:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:54.348 ************************************ 00:12:54.348 START TEST raid_state_function_test_sb 00:12:54.348 ************************************ 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 true 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.348 13:41:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 
00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1534379 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1534379' 00:12:54.348 Process raid pid: 1534379 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1534379 /var/tmp/spdk-raid.sock 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1534379 ']' 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:54.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:54.348 13:41:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.608 [2024-06-10 13:41:08.851309] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:12:54.608 [2024-06-10 13:41:08.851360] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:54.608 [2024-06-10 13:41:08.944123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:54.608 [2024-06-10 13:41:09.039423] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:12:54.868 [2024-06-10 13:41:09.100537] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:54.868 [2024-06-10 13:41:09.100558] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.440 13:41:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:55.440 13:41:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:12:55.440 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:55.700 [2024-06-10 13:41:09.926476] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:55.700 [2024-06-10 13:41:09.926521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:55.701 [2024-06-10 13:41:09.926528] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:55.701 [2024-06-10 13:41:09.926535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:55.701 [2024-06-10 13:41:09.926541] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:55.701 [2024-06-10 13:41:09.926547] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:55.701 13:41:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.701 13:41:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.701 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.701 "name": "Existed_Raid", 00:12:55.701 "uuid": "c6db8369-c1ef-43d9-bf76-e87c62bb7aa4", 00:12:55.701 "strip_size_kb": 64, 00:12:55.701 "state": "configuring", 00:12:55.701 "raid_level": "concat", 00:12:55.701 "superblock": true, 00:12:55.701 "num_base_bdevs": 3, 00:12:55.701 "num_base_bdevs_discovered": 0, 00:12:55.701 "num_base_bdevs_operational": 3, 00:12:55.701 
"base_bdevs_list": [ 00:12:55.701 { 00:12:55.701 "name": "BaseBdev1", 00:12:55.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.701 "is_configured": false, 00:12:55.701 "data_offset": 0, 00:12:55.701 "data_size": 0 00:12:55.701 }, 00:12:55.701 { 00:12:55.701 "name": "BaseBdev2", 00:12:55.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.701 "is_configured": false, 00:12:55.701 "data_offset": 0, 00:12:55.701 "data_size": 0 00:12:55.701 }, 00:12:55.701 { 00:12:55.701 "name": "BaseBdev3", 00:12:55.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.701 "is_configured": false, 00:12:55.701 "data_offset": 0, 00:12:55.701 "data_size": 0 00:12:55.701 } 00:12:55.701 ] 00:12:55.701 }' 00:12:55.701 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.701 13:41:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:56.275 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:56.535 [2024-06-10 13:41:10.936907] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:56.535 [2024-06-10 13:41:10.936945] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ac740 name Existed_Raid, state configuring 00:12:56.535 13:41:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:56.794 [2024-06-10 13:41:11.157490] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:56.794 [2024-06-10 13:41:11.157515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:56.794 [2024-06-10 13:41:11.157522] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:56.794 [2024-06-10 13:41:11.157529] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:56.794 [2024-06-10 13:41:11.157534] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:56.794 [2024-06-10 13:41:11.157541] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:56.794 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:57.054 [2024-06-10 13:41:11.388630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.054 BaseBdev1 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:57.054 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.315 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:57.575 [ 00:12:57.575 { 
00:12:57.575 "name": "BaseBdev1", 00:12:57.575 "aliases": [ 00:12:57.575 "be6b30b9-44c5-4dc0-8c37-b4e9be27f146" 00:12:57.575 ], 00:12:57.575 "product_name": "Malloc disk", 00:12:57.575 "block_size": 512, 00:12:57.575 "num_blocks": 65536, 00:12:57.575 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:12:57.575 "assigned_rate_limits": { 00:12:57.575 "rw_ios_per_sec": 0, 00:12:57.575 "rw_mbytes_per_sec": 0, 00:12:57.575 "r_mbytes_per_sec": 0, 00:12:57.575 "w_mbytes_per_sec": 0 00:12:57.575 }, 00:12:57.575 "claimed": true, 00:12:57.575 "claim_type": "exclusive_write", 00:12:57.575 "zoned": false, 00:12:57.575 "supported_io_types": { 00:12:57.575 "read": true, 00:12:57.575 "write": true, 00:12:57.575 "unmap": true, 00:12:57.575 "write_zeroes": true, 00:12:57.575 "flush": true, 00:12:57.575 "reset": true, 00:12:57.575 "compare": false, 00:12:57.575 "compare_and_write": false, 00:12:57.575 "abort": true, 00:12:57.575 "nvme_admin": false, 00:12:57.575 "nvme_io": false 00:12:57.575 }, 00:12:57.575 "memory_domains": [ 00:12:57.575 { 00:12:57.575 "dma_device_id": "system", 00:12:57.575 "dma_device_type": 1 00:12:57.575 }, 00:12:57.575 { 00:12:57.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.575 "dma_device_type": 2 00:12:57.575 } 00:12:57.575 ], 00:12:57.575 "driver_specific": {} 00:12:57.575 } 00:12:57.575 ] 00:12:57.575 13:41:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:57.575 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:57.576 13:41:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.576 13:41:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.836 13:41:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.836 "name": "Existed_Raid", 00:12:57.836 "uuid": "21375305-cc2b-48f9-bbef-d1f061221048", 00:12:57.836 "strip_size_kb": 64, 00:12:57.836 "state": "configuring", 00:12:57.836 "raid_level": "concat", 00:12:57.836 "superblock": true, 00:12:57.836 "num_base_bdevs": 3, 00:12:57.836 "num_base_bdevs_discovered": 1, 00:12:57.836 "num_base_bdevs_operational": 3, 00:12:57.836 "base_bdevs_list": [ 00:12:57.836 { 00:12:57.836 "name": "BaseBdev1", 00:12:57.836 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:12:57.836 "is_configured": true, 00:12:57.836 "data_offset": 2048, 00:12:57.836 "data_size": 63488 00:12:57.836 }, 00:12:57.836 { 00:12:57.836 "name": "BaseBdev2", 00:12:57.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.836 "is_configured": false, 00:12:57.836 "data_offset": 0, 00:12:57.836 "data_size": 0 00:12:57.836 }, 00:12:57.836 { 00:12:57.836 "name": 
"BaseBdev3", 00:12:57.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.836 "is_configured": false, 00:12:57.836 "data_offset": 0, 00:12:57.836 "data_size": 0 00:12:57.836 } 00:12:57.836 ] 00:12:57.836 }' 00:12:57.836 13:41:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.836 13:41:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:58.408 13:41:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:58.408 [2024-06-10 13:41:12.828253] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:58.408 [2024-06-10 13:41:12.828280] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ac010 name Existed_Raid, state configuring 00:12:58.408 13:41:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:58.668 [2024-06-10 13:41:13.028804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:58.668 [2024-06-10 13:41:13.030023] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:58.668 [2024-06-10 13:41:13.030046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:58.668 [2024-06-10 13:41:13.030056] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:58.668 [2024-06-10 13:41:13.030062] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.668 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.928 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.928 "name": "Existed_Raid", 00:12:58.928 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:12:58.928 "strip_size_kb": 64, 00:12:58.928 "state": "configuring", 00:12:58.928 "raid_level": "concat", 00:12:58.928 "superblock": true, 00:12:58.928 "num_base_bdevs": 3, 00:12:58.928 
"num_base_bdevs_discovered": 1, 00:12:58.928 "num_base_bdevs_operational": 3, 00:12:58.928 "base_bdevs_list": [ 00:12:58.928 { 00:12:58.928 "name": "BaseBdev1", 00:12:58.928 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:12:58.928 "is_configured": true, 00:12:58.928 "data_offset": 2048, 00:12:58.928 "data_size": 63488 00:12:58.928 }, 00:12:58.928 { 00:12:58.928 "name": "BaseBdev2", 00:12:58.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.928 "is_configured": false, 00:12:58.928 "data_offset": 0, 00:12:58.928 "data_size": 0 00:12:58.928 }, 00:12:58.928 { 00:12:58.928 "name": "BaseBdev3", 00:12:58.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.928 "is_configured": false, 00:12:58.928 "data_offset": 0, 00:12:58.928 "data_size": 0 00:12:58.928 } 00:12:58.928 ] 00:12:58.928 }' 00:12:58.928 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.928 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.498 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:59.759 [2024-06-10 13:41:13.984308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:59.759 BaseBdev2 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:59.759 13:41:13 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:59.759 13:41:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:59.759 13:41:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:00.020 [ 00:13:00.020 { 00:13:00.020 "name": "BaseBdev2", 00:13:00.020 "aliases": [ 00:13:00.020 "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373" 00:13:00.020 ], 00:13:00.020 "product_name": "Malloc disk", 00:13:00.020 "block_size": 512, 00:13:00.020 "num_blocks": 65536, 00:13:00.020 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:00.020 "assigned_rate_limits": { 00:13:00.020 "rw_ios_per_sec": 0, 00:13:00.020 "rw_mbytes_per_sec": 0, 00:13:00.020 "r_mbytes_per_sec": 0, 00:13:00.020 "w_mbytes_per_sec": 0 00:13:00.020 }, 00:13:00.020 "claimed": true, 00:13:00.020 "claim_type": "exclusive_write", 00:13:00.020 "zoned": false, 00:13:00.020 "supported_io_types": { 00:13:00.020 "read": true, 00:13:00.020 "write": true, 00:13:00.020 "unmap": true, 00:13:00.020 "write_zeroes": true, 00:13:00.020 "flush": true, 00:13:00.020 "reset": true, 00:13:00.020 "compare": false, 00:13:00.020 "compare_and_write": false, 00:13:00.020 "abort": true, 00:13:00.020 "nvme_admin": false, 00:13:00.020 "nvme_io": false 00:13:00.020 }, 00:13:00.020 "memory_domains": [ 00:13:00.020 { 00:13:00.020 "dma_device_id": "system", 00:13:00.020 "dma_device_type": 1 00:13:00.020 }, 00:13:00.020 { 00:13:00.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.020 "dma_device_type": 2 00:13:00.020 } 00:13:00.020 ], 00:13:00.020 "driver_specific": {} 00:13:00.020 } 00:13:00.020 ] 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.020 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.281 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.281 "name": "Existed_Raid", 00:13:00.281 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:13:00.281 "strip_size_kb": 64, 00:13:00.281 
"state": "configuring", 00:13:00.281 "raid_level": "concat", 00:13:00.281 "superblock": true, 00:13:00.281 "num_base_bdevs": 3, 00:13:00.281 "num_base_bdevs_discovered": 2, 00:13:00.281 "num_base_bdevs_operational": 3, 00:13:00.281 "base_bdevs_list": [ 00:13:00.281 { 00:13:00.281 "name": "BaseBdev1", 00:13:00.281 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:13:00.281 "is_configured": true, 00:13:00.281 "data_offset": 2048, 00:13:00.281 "data_size": 63488 00:13:00.281 }, 00:13:00.281 { 00:13:00.281 "name": "BaseBdev2", 00:13:00.281 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:00.281 "is_configured": true, 00:13:00.281 "data_offset": 2048, 00:13:00.281 "data_size": 63488 00:13:00.281 }, 00:13:00.281 { 00:13:00.281 "name": "BaseBdev3", 00:13:00.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.281 "is_configured": false, 00:13:00.281 "data_offset": 0, 00:13:00.281 "data_size": 0 00:13:00.281 } 00:13:00.281 ] 00:13:00.281 }' 00:13:00.281 13:41:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.281 13:41:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:00.922 [2024-06-10 13:41:15.356892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:00.922 [2024-06-10 13:41:15.357006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13acf00 00:13:00.922 [2024-06-10 13:41:15.357014] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:00.922 [2024-06-10 13:41:15.357168] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c3df0 00:13:00.922 [2024-06-10 13:41:15.357265] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x13acf00 00:13:00.922 [2024-06-10 13:41:15.357271] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13acf00 00:13:00.922 [2024-06-10 13:41:15.357346] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.922 BaseBdev3 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:00.922 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.184 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:01.446 [ 00:13:01.446 { 00:13:01.446 "name": "BaseBdev3", 00:13:01.446 "aliases": [ 00:13:01.446 "dfcfc885-afb9-49d7-8b23-45f4529d3f4d" 00:13:01.446 ], 00:13:01.446 "product_name": "Malloc disk", 00:13:01.446 "block_size": 512, 00:13:01.446 "num_blocks": 65536, 00:13:01.446 "uuid": "dfcfc885-afb9-49d7-8b23-45f4529d3f4d", 00:13:01.446 "assigned_rate_limits": { 00:13:01.446 "rw_ios_per_sec": 0, 00:13:01.446 "rw_mbytes_per_sec": 0, 00:13:01.446 "r_mbytes_per_sec": 0, 00:13:01.446 "w_mbytes_per_sec": 0 00:13:01.446 }, 00:13:01.446 "claimed": true, 00:13:01.446 
"claim_type": "exclusive_write", 00:13:01.446 "zoned": false, 00:13:01.446 "supported_io_types": { 00:13:01.446 "read": true, 00:13:01.446 "write": true, 00:13:01.446 "unmap": true, 00:13:01.446 "write_zeroes": true, 00:13:01.446 "flush": true, 00:13:01.446 "reset": true, 00:13:01.446 "compare": false, 00:13:01.446 "compare_and_write": false, 00:13:01.446 "abort": true, 00:13:01.446 "nvme_admin": false, 00:13:01.446 "nvme_io": false 00:13:01.446 }, 00:13:01.446 "memory_domains": [ 00:13:01.446 { 00:13:01.446 "dma_device_id": "system", 00:13:01.446 "dma_device_type": 1 00:13:01.446 }, 00:13:01.446 { 00:13:01.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.446 "dma_device_type": 2 00:13:01.446 } 00:13:01.446 ], 00:13:01.446 "driver_specific": {} 00:13:01.446 } 00:13:01.446 ] 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.446 13:41:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.446 "name": "Existed_Raid", 00:13:01.446 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:13:01.446 "strip_size_kb": 64, 00:13:01.446 "state": "online", 00:13:01.446 "raid_level": "concat", 00:13:01.446 "superblock": true, 00:13:01.446 "num_base_bdevs": 3, 00:13:01.446 "num_base_bdevs_discovered": 3, 00:13:01.446 "num_base_bdevs_operational": 3, 00:13:01.446 "base_bdevs_list": [ 00:13:01.446 { 00:13:01.446 "name": "BaseBdev1", 00:13:01.446 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:13:01.446 "is_configured": true, 00:13:01.446 "data_offset": 2048, 00:13:01.446 "data_size": 63488 00:13:01.446 }, 00:13:01.446 { 00:13:01.446 "name": "BaseBdev2", 00:13:01.446 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:01.446 "is_configured": true, 00:13:01.446 "data_offset": 2048, 00:13:01.446 "data_size": 63488 00:13:01.446 }, 00:13:01.446 { 00:13:01.446 "name": "BaseBdev3", 00:13:01.446 "uuid": "dfcfc885-afb9-49d7-8b23-45f4529d3f4d", 00:13:01.446 "is_configured": true, 00:13:01.446 "data_offset": 2048, 00:13:01.446 "data_size": 63488 00:13:01.446 } 00:13:01.446 ] 00:13:01.446 }' 00:13:01.446 13:41:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.446 13:41:15 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:02.018 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:02.279 [2024-06-10 13:41:16.652417] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:02.279 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:02.279 "name": "Existed_Raid", 00:13:02.279 "aliases": [ 00:13:02.279 "24cb3219-3d32-4439-8f5d-54c2913b055e" 00:13:02.279 ], 00:13:02.279 "product_name": "Raid Volume", 00:13:02.279 "block_size": 512, 00:13:02.279 "num_blocks": 190464, 00:13:02.279 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:13:02.279 "assigned_rate_limits": { 00:13:02.279 "rw_ios_per_sec": 0, 00:13:02.279 "rw_mbytes_per_sec": 0, 00:13:02.279 "r_mbytes_per_sec": 0, 00:13:02.279 "w_mbytes_per_sec": 0 00:13:02.279 }, 00:13:02.279 "claimed": false, 00:13:02.279 "zoned": false, 00:13:02.279 "supported_io_types": { 00:13:02.279 "read": true, 00:13:02.279 "write": true, 00:13:02.279 "unmap": true, 
00:13:02.279 "write_zeroes": true, 00:13:02.279 "flush": true, 00:13:02.279 "reset": true, 00:13:02.279 "compare": false, 00:13:02.279 "compare_and_write": false, 00:13:02.279 "abort": false, 00:13:02.279 "nvme_admin": false, 00:13:02.279 "nvme_io": false 00:13:02.279 }, 00:13:02.279 "memory_domains": [ 00:13:02.279 { 00:13:02.279 "dma_device_id": "system", 00:13:02.279 "dma_device_type": 1 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.279 "dma_device_type": 2 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "dma_device_id": "system", 00:13:02.279 "dma_device_type": 1 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.279 "dma_device_type": 2 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "dma_device_id": "system", 00:13:02.279 "dma_device_type": 1 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.279 "dma_device_type": 2 00:13:02.279 } 00:13:02.279 ], 00:13:02.279 "driver_specific": { 00:13:02.279 "raid": { 00:13:02.279 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:13:02.279 "strip_size_kb": 64, 00:13:02.279 "state": "online", 00:13:02.279 "raid_level": "concat", 00:13:02.279 "superblock": true, 00:13:02.279 "num_base_bdevs": 3, 00:13:02.279 "num_base_bdevs_discovered": 3, 00:13:02.279 "num_base_bdevs_operational": 3, 00:13:02.279 "base_bdevs_list": [ 00:13:02.279 { 00:13:02.279 "name": "BaseBdev1", 00:13:02.279 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:13:02.279 "is_configured": true, 00:13:02.279 "data_offset": 2048, 00:13:02.279 "data_size": 63488 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "name": "BaseBdev2", 00:13:02.279 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:02.279 "is_configured": true, 00:13:02.279 "data_offset": 2048, 00:13:02.279 "data_size": 63488 00:13:02.279 }, 00:13:02.279 { 00:13:02.279 "name": "BaseBdev3", 00:13:02.279 "uuid": "dfcfc885-afb9-49d7-8b23-45f4529d3f4d", 00:13:02.279 
"is_configured": true, 00:13:02.279 "data_offset": 2048, 00:13:02.279 "data_size": 63488 00:13:02.279 } 00:13:02.279 ] 00:13:02.279 } 00:13:02.279 } 00:13:02.280 }' 00:13:02.280 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:02.280 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:02.280 BaseBdev2 00:13:02.280 BaseBdev3' 00:13:02.280 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.280 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:02.280 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.540 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.540 "name": "BaseBdev1", 00:13:02.540 "aliases": [ 00:13:02.540 "be6b30b9-44c5-4dc0-8c37-b4e9be27f146" 00:13:02.540 ], 00:13:02.540 "product_name": "Malloc disk", 00:13:02.540 "block_size": 512, 00:13:02.540 "num_blocks": 65536, 00:13:02.540 "uuid": "be6b30b9-44c5-4dc0-8c37-b4e9be27f146", 00:13:02.540 "assigned_rate_limits": { 00:13:02.540 "rw_ios_per_sec": 0, 00:13:02.540 "rw_mbytes_per_sec": 0, 00:13:02.540 "r_mbytes_per_sec": 0, 00:13:02.540 "w_mbytes_per_sec": 0 00:13:02.540 }, 00:13:02.540 "claimed": true, 00:13:02.540 "claim_type": "exclusive_write", 00:13:02.540 "zoned": false, 00:13:02.540 "supported_io_types": { 00:13:02.540 "read": true, 00:13:02.540 "write": true, 00:13:02.540 "unmap": true, 00:13:02.540 "write_zeroes": true, 00:13:02.540 "flush": true, 00:13:02.540 "reset": true, 00:13:02.540 "compare": false, 00:13:02.540 "compare_and_write": false, 00:13:02.540 "abort": true, 00:13:02.540 "nvme_admin": false, 00:13:02.540 
"nvme_io": false 00:13:02.540 }, 00:13:02.540 "memory_domains": [ 00:13:02.540 { 00:13:02.540 "dma_device_id": "system", 00:13:02.540 "dma_device_type": 1 00:13:02.540 }, 00:13:02.540 { 00:13:02.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.540 "dma_device_type": 2 00:13:02.540 } 00:13:02.540 ], 00:13:02.540 "driver_specific": {} 00:13:02.540 }' 00:13:02.540 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.540 13:41:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.540 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:02.540 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.802 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:02.802 13:41:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.063 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.063 "name": "BaseBdev2", 00:13:03.063 "aliases": [ 00:13:03.063 "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373" 00:13:03.063 ], 00:13:03.063 "product_name": "Malloc disk", 00:13:03.063 "block_size": 512, 00:13:03.063 "num_blocks": 65536, 00:13:03.063 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:03.063 "assigned_rate_limits": { 00:13:03.063 "rw_ios_per_sec": 0, 00:13:03.063 "rw_mbytes_per_sec": 0, 00:13:03.063 "r_mbytes_per_sec": 0, 00:13:03.063 "w_mbytes_per_sec": 0 00:13:03.063 }, 00:13:03.063 "claimed": true, 00:13:03.063 "claim_type": "exclusive_write", 00:13:03.063 "zoned": false, 00:13:03.063 "supported_io_types": { 00:13:03.063 "read": true, 00:13:03.063 "write": true, 00:13:03.063 "unmap": true, 00:13:03.063 "write_zeroes": true, 00:13:03.063 "flush": true, 00:13:03.063 "reset": true, 00:13:03.063 "compare": false, 00:13:03.063 "compare_and_write": false, 00:13:03.063 "abort": true, 00:13:03.063 "nvme_admin": false, 00:13:03.063 "nvme_io": false 00:13:03.063 }, 00:13:03.063 "memory_domains": [ 00:13:03.063 { 00:13:03.063 "dma_device_id": "system", 00:13:03.063 "dma_device_type": 1 00:13:03.063 }, 00:13:03.063 { 00:13:03.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.063 "dma_device_type": 2 00:13:03.063 } 00:13:03.063 ], 00:13:03.063 "driver_specific": {} 00:13:03.063 }' 00:13:03.063 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.063 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.324 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.325 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.325 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.586 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.586 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:03.586 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:03.586 13:41:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:03.586 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.586 "name": "BaseBdev3", 00:13:03.586 "aliases": [ 00:13:03.586 "dfcfc885-afb9-49d7-8b23-45f4529d3f4d" 00:13:03.586 ], 00:13:03.586 "product_name": "Malloc disk", 00:13:03.586 "block_size": 512, 00:13:03.586 "num_blocks": 65536, 00:13:03.586 "uuid": "dfcfc885-afb9-49d7-8b23-45f4529d3f4d", 00:13:03.586 "assigned_rate_limits": { 00:13:03.586 "rw_ios_per_sec": 0, 00:13:03.586 "rw_mbytes_per_sec": 0, 00:13:03.586 "r_mbytes_per_sec": 0, 00:13:03.586 "w_mbytes_per_sec": 0 00:13:03.586 }, 00:13:03.586 "claimed": true, 00:13:03.586 "claim_type": "exclusive_write", 00:13:03.586 "zoned": false, 00:13:03.586 "supported_io_types": { 00:13:03.586 "read": true, 00:13:03.586 
"write": true, 00:13:03.586 "unmap": true, 00:13:03.586 "write_zeroes": true, 00:13:03.586 "flush": true, 00:13:03.586 "reset": true, 00:13:03.586 "compare": false, 00:13:03.586 "compare_and_write": false, 00:13:03.586 "abort": true, 00:13:03.586 "nvme_admin": false, 00:13:03.586 "nvme_io": false 00:13:03.586 }, 00:13:03.586 "memory_domains": [ 00:13:03.586 { 00:13:03.586 "dma_device_id": "system", 00:13:03.586 "dma_device_type": 1 00:13:03.586 }, 00:13:03.586 { 00:13:03.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.586 "dma_device_type": 2 00:13:03.586 } 00:13:03.586 ], 00:13:03.586 "driver_specific": {} 00:13:03.586 }' 00:13:03.586 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.847 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:04.107 [2024-06-10 13:41:18.545023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:04.107 [2024-06-10 13:41:18.545040] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:04.107 [2024-06-10 13:41:18.545072] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.107 13:41:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.107 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:04.368 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.368 "name": "Existed_Raid", 00:13:04.368 "uuid": "24cb3219-3d32-4439-8f5d-54c2913b055e", 00:13:04.368 "strip_size_kb": 64, 00:13:04.368 "state": "offline", 00:13:04.368 "raid_level": "concat", 00:13:04.368 "superblock": true, 00:13:04.368 "num_base_bdevs": 3, 00:13:04.368 "num_base_bdevs_discovered": 2, 00:13:04.368 "num_base_bdevs_operational": 2, 00:13:04.368 "base_bdevs_list": [ 00:13:04.368 { 00:13:04.368 "name": null, 00:13:04.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.368 "is_configured": false, 00:13:04.368 "data_offset": 2048, 00:13:04.368 "data_size": 63488 00:13:04.368 }, 00:13:04.368 { 00:13:04.368 "name": "BaseBdev2", 00:13:04.368 "uuid": "af1fcd5f-d6f5-4a98-8ae7-83df8ac9c373", 00:13:04.368 "is_configured": true, 00:13:04.368 "data_offset": 2048, 00:13:04.368 "data_size": 63488 00:13:04.368 }, 00:13:04.368 { 00:13:04.368 "name": "BaseBdev3", 00:13:04.368 "uuid": "dfcfc885-afb9-49d7-8b23-45f4529d3f4d", 00:13:04.368 "is_configured": true, 00:13:04.368 "data_offset": 2048, 00:13:04.368 "data_size": 63488 00:13:04.368 } 00:13:04.368 ] 00:13:04.368 }' 00:13:04.368 13:41:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.368 13:41:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:04.940 13:41:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:04.940 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:04.940 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.940 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:05.201 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:05.201 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:05.201 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:05.462 [2024-06-10 13:41:19.715983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:05.462 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:05.462 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.462 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.462 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:05.723 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:05.723 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:05.723 13:41:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:05.723 [2024-06-10 13:41:20.130931] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:05.723 [2024-06-10 13:41:20.130968] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13acf00 name Existed_Raid, state offline 00:13:05.723 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:05.723 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.723 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.723 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:05.984 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:06.246 BaseBdev2 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:06.246 13:41:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:06.246 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:06.507 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:06.507 [ 00:13:06.507 { 00:13:06.507 "name": "BaseBdev2", 00:13:06.507 "aliases": [ 00:13:06.507 "870dc945-ae76-4722-8c4b-2e44032e5000" 00:13:06.507 ], 00:13:06.507 "product_name": "Malloc disk", 00:13:06.507 "block_size": 512, 00:13:06.507 "num_blocks": 65536, 00:13:06.507 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:06.507 "assigned_rate_limits": { 00:13:06.507 "rw_ios_per_sec": 0, 00:13:06.507 "rw_mbytes_per_sec": 0, 00:13:06.507 "r_mbytes_per_sec": 0, 00:13:06.507 "w_mbytes_per_sec": 0 00:13:06.507 }, 00:13:06.507 "claimed": false, 00:13:06.507 "zoned": false, 00:13:06.507 "supported_io_types": { 00:13:06.507 "read": true, 00:13:06.507 "write": true, 00:13:06.507 "unmap": true, 00:13:06.507 "write_zeroes": true, 00:13:06.507 "flush": true, 00:13:06.507 "reset": true, 00:13:06.507 "compare": false, 00:13:06.507 "compare_and_write": false, 00:13:06.507 "abort": true, 00:13:06.508 "nvme_admin": false, 00:13:06.508 "nvme_io": false 00:13:06.508 }, 00:13:06.508 "memory_domains": [ 00:13:06.508 { 00:13:06.508 "dma_device_id": "system", 00:13:06.508 "dma_device_type": 1 00:13:06.508 }, 00:13:06.508 { 00:13:06.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.508 "dma_device_type": 2 00:13:06.508 } 00:13:06.508 ], 
00:13:06.508 "driver_specific": {} 00:13:06.508 } 00:13:06.508 ] 00:13:06.508 13:41:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:06.508 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:06.508 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:06.508 13:41:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:06.769 BaseBdev3 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:06.769 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:07.030 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:07.290 [ 00:13:07.290 { 00:13:07.290 "name": "BaseBdev3", 00:13:07.290 "aliases": [ 00:13:07.290 "8771d8d1-0e94-4f1b-bbbc-38ede62edeed" 00:13:07.290 ], 00:13:07.290 "product_name": "Malloc disk", 00:13:07.290 "block_size": 512, 00:13:07.290 
"num_blocks": 65536, 00:13:07.290 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:07.290 "assigned_rate_limits": { 00:13:07.290 "rw_ios_per_sec": 0, 00:13:07.290 "rw_mbytes_per_sec": 0, 00:13:07.290 "r_mbytes_per_sec": 0, 00:13:07.290 "w_mbytes_per_sec": 0 00:13:07.290 }, 00:13:07.290 "claimed": false, 00:13:07.290 "zoned": false, 00:13:07.290 "supported_io_types": { 00:13:07.290 "read": true, 00:13:07.290 "write": true, 00:13:07.290 "unmap": true, 00:13:07.290 "write_zeroes": true, 00:13:07.290 "flush": true, 00:13:07.290 "reset": true, 00:13:07.290 "compare": false, 00:13:07.290 "compare_and_write": false, 00:13:07.290 "abort": true, 00:13:07.290 "nvme_admin": false, 00:13:07.290 "nvme_io": false 00:13:07.290 }, 00:13:07.290 "memory_domains": [ 00:13:07.290 { 00:13:07.290 "dma_device_id": "system", 00:13:07.290 "dma_device_type": 1 00:13:07.290 }, 00:13:07.290 { 00:13:07.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.290 "dma_device_type": 2 00:13:07.290 } 00:13:07.290 ], 00:13:07.290 "driver_specific": {} 00:13:07.290 } 00:13:07.290 ] 00:13:07.290 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:07.290 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:07.290 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:07.291 [2024-06-10 13:41:21.727125] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:07.291 [2024-06-10 13:41:21.727158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:07.291 [2024-06-10 13:41:21.727175] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:07.291 [2024-06-10 13:41:21.728274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.291 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.551 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.551 "name": "Existed_Raid", 00:13:07.551 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:07.551 "strip_size_kb": 64, 00:13:07.551 "state": 
"configuring", 00:13:07.551 "raid_level": "concat", 00:13:07.551 "superblock": true, 00:13:07.551 "num_base_bdevs": 3, 00:13:07.551 "num_base_bdevs_discovered": 2, 00:13:07.551 "num_base_bdevs_operational": 3, 00:13:07.551 "base_bdevs_list": [ 00:13:07.551 { 00:13:07.551 "name": "BaseBdev1", 00:13:07.551 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.551 "is_configured": false, 00:13:07.551 "data_offset": 0, 00:13:07.551 "data_size": 0 00:13:07.551 }, 00:13:07.551 { 00:13:07.551 "name": "BaseBdev2", 00:13:07.551 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:07.551 "is_configured": true, 00:13:07.551 "data_offset": 2048, 00:13:07.551 "data_size": 63488 00:13:07.551 }, 00:13:07.551 { 00:13:07.551 "name": "BaseBdev3", 00:13:07.551 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:07.551 "is_configured": true, 00:13:07.551 "data_offset": 2048, 00:13:07.551 "data_size": 63488 00:13:07.551 } 00:13:07.551 ] 00:13:07.551 }' 00:13:07.551 13:41:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.551 13:41:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:08.131 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:08.392 [2024-06-10 13:41:22.689534] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.392 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.652 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.652 "name": "Existed_Raid", 00:13:08.652 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:08.652 "strip_size_kb": 64, 00:13:08.652 "state": "configuring", 00:13:08.652 "raid_level": "concat", 00:13:08.652 "superblock": true, 00:13:08.653 "num_base_bdevs": 3, 00:13:08.653 "num_base_bdevs_discovered": 1, 00:13:08.653 "num_base_bdevs_operational": 3, 00:13:08.653 "base_bdevs_list": [ 00:13:08.653 { 00:13:08.653 "name": "BaseBdev1", 00:13:08.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.653 "is_configured": false, 00:13:08.653 "data_offset": 0, 00:13:08.653 "data_size": 0 00:13:08.653 }, 00:13:08.653 { 00:13:08.653 "name": null, 00:13:08.653 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:08.653 "is_configured": false, 00:13:08.653 "data_offset": 2048, 00:13:08.653 "data_size": 63488 00:13:08.653 }, 00:13:08.653 { 00:13:08.653 
"name": "BaseBdev3", 00:13:08.653 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:08.653 "is_configured": true, 00:13:08.653 "data_offset": 2048, 00:13:08.653 "data_size": 63488 00:13:08.653 } 00:13:08.653 ] 00:13:08.653 }' 00:13:08.653 13:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.653 13:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.224 13:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.224 13:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:09.224 13:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:09.224 13:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:09.484 [2024-06-10 13:41:23.873489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:09.484 BaseBdev1 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:09.484 13:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:09.484 13:41:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:09.744 13:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.004 [ 00:13:10.004 { 00:13:10.004 "name": "BaseBdev1", 00:13:10.004 "aliases": [ 00:13:10.004 "c1e186dc-a50a-4e78-a9f2-33887999361d" 00:13:10.004 ], 00:13:10.004 "product_name": "Malloc disk", 00:13:10.004 "block_size": 512, 00:13:10.004 "num_blocks": 65536, 00:13:10.004 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:10.004 "assigned_rate_limits": { 00:13:10.004 "rw_ios_per_sec": 0, 00:13:10.004 "rw_mbytes_per_sec": 0, 00:13:10.004 "r_mbytes_per_sec": 0, 00:13:10.004 "w_mbytes_per_sec": 0 00:13:10.004 }, 00:13:10.004 "claimed": true, 00:13:10.004 "claim_type": "exclusive_write", 00:13:10.004 "zoned": false, 00:13:10.004 "supported_io_types": { 00:13:10.004 "read": true, 00:13:10.004 "write": true, 00:13:10.004 "unmap": true, 00:13:10.004 "write_zeroes": true, 00:13:10.004 "flush": true, 00:13:10.004 "reset": true, 00:13:10.004 "compare": false, 00:13:10.004 "compare_and_write": false, 00:13:10.004 "abort": true, 00:13:10.004 "nvme_admin": false, 00:13:10.004 "nvme_io": false 00:13:10.004 }, 00:13:10.004 "memory_domains": [ 00:13:10.004 { 00:13:10.004 "dma_device_id": "system", 00:13:10.004 "dma_device_type": 1 00:13:10.004 }, 00:13:10.004 { 00:13:10.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.004 "dma_device_type": 2 00:13:10.004 } 00:13:10.004 ], 00:13:10.004 "driver_specific": {} 00:13:10.004 } 00:13:10.004 ] 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring concat 64 3 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.004 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.264 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.264 "name": "Existed_Raid", 00:13:10.264 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:10.264 "strip_size_kb": 64, 00:13:10.264 "state": "configuring", 00:13:10.264 "raid_level": "concat", 00:13:10.264 "superblock": true, 00:13:10.264 "num_base_bdevs": 3, 00:13:10.264 "num_base_bdevs_discovered": 2, 00:13:10.264 "num_base_bdevs_operational": 3, 00:13:10.264 "base_bdevs_list": [ 00:13:10.264 { 00:13:10.264 "name": "BaseBdev1", 00:13:10.264 "uuid": 
"c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:10.264 "is_configured": true, 00:13:10.264 "data_offset": 2048, 00:13:10.264 "data_size": 63488 00:13:10.264 }, 00:13:10.264 { 00:13:10.264 "name": null, 00:13:10.264 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:10.264 "is_configured": false, 00:13:10.264 "data_offset": 2048, 00:13:10.264 "data_size": 63488 00:13:10.264 }, 00:13:10.264 { 00:13:10.264 "name": "BaseBdev3", 00:13:10.264 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:10.264 "is_configured": true, 00:13:10.264 "data_offset": 2048, 00:13:10.264 "data_size": 63488 00:13:10.264 } 00:13:10.264 ] 00:13:10.264 }' 00:13:10.264 13:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.264 13:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:10.833 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.833 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:10.833 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:10.833 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:11.092 [2024-06-10 13:41:25.345244] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.092 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.352 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.352 "name": "Existed_Raid", 00:13:11.352 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:11.352 "strip_size_kb": 64, 00:13:11.352 "state": "configuring", 00:13:11.352 "raid_level": "concat", 00:13:11.352 "superblock": true, 00:13:11.352 "num_base_bdevs": 3, 00:13:11.352 "num_base_bdevs_discovered": 1, 00:13:11.352 "num_base_bdevs_operational": 3, 00:13:11.352 "base_bdevs_list": [ 00:13:11.352 { 00:13:11.352 "name": "BaseBdev1", 00:13:11.352 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:11.352 "is_configured": true, 00:13:11.352 "data_offset": 2048, 00:13:11.352 "data_size": 63488 00:13:11.352 }, 00:13:11.352 { 00:13:11.352 "name": null, 00:13:11.352 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 
00:13:11.352 "is_configured": false, 00:13:11.352 "data_offset": 2048, 00:13:11.352 "data_size": 63488 00:13:11.352 }, 00:13:11.352 { 00:13:11.352 "name": null, 00:13:11.352 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:11.352 "is_configured": false, 00:13:11.352 "data_offset": 2048, 00:13:11.352 "data_size": 63488 00:13:11.352 } 00:13:11.352 ] 00:13:11.352 }' 00:13:11.352 13:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.352 13:41:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.922 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.922 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:11.922 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:11.922 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:12.182 [2024-06-10 13:41:26.504198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.182 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.442 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.442 "name": "Existed_Raid", 00:13:12.442 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:12.442 "strip_size_kb": 64, 00:13:12.442 "state": "configuring", 00:13:12.442 "raid_level": "concat", 00:13:12.442 "superblock": true, 00:13:12.442 "num_base_bdevs": 3, 00:13:12.442 "num_base_bdevs_discovered": 2, 00:13:12.442 "num_base_bdevs_operational": 3, 00:13:12.442 "base_bdevs_list": [ 00:13:12.442 { 00:13:12.442 "name": "BaseBdev1", 00:13:12.442 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:12.442 "is_configured": true, 00:13:12.442 "data_offset": 2048, 00:13:12.442 "data_size": 63488 00:13:12.442 }, 00:13:12.442 { 00:13:12.442 "name": null, 00:13:12.442 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:12.442 "is_configured": false, 00:13:12.442 "data_offset": 2048, 00:13:12.442 "data_size": 63488 00:13:12.442 }, 00:13:12.442 { 00:13:12.442 "name": "BaseBdev3", 00:13:12.442 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 
00:13:12.442 "is_configured": true, 00:13:12.442 "data_offset": 2048, 00:13:12.442 "data_size": 63488 00:13:12.442 } 00:13:12.442 ] 00:13:12.442 }' 00:13:12.442 13:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.442 13:41:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:13.013 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.013 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:13.272 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:13.273 [2024-06-10 13:41:27.679193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.273 
13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.273 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.533 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.533 "name": "Existed_Raid", 00:13:13.533 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:13.533 "strip_size_kb": 64, 00:13:13.533 "state": "configuring", 00:13:13.533 "raid_level": "concat", 00:13:13.533 "superblock": true, 00:13:13.533 "num_base_bdevs": 3, 00:13:13.533 "num_base_bdevs_discovered": 1, 00:13:13.533 "num_base_bdevs_operational": 3, 00:13:13.533 "base_bdevs_list": [ 00:13:13.533 { 00:13:13.533 "name": null, 00:13:13.533 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:13.533 "is_configured": false, 00:13:13.533 "data_offset": 2048, 00:13:13.533 "data_size": 63488 00:13:13.533 }, 00:13:13.533 { 00:13:13.533 "name": null, 00:13:13.533 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:13.533 "is_configured": false, 00:13:13.533 "data_offset": 2048, 00:13:13.533 "data_size": 63488 00:13:13.533 }, 00:13:13.533 { 00:13:13.533 "name": "BaseBdev3", 00:13:13.533 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:13.533 "is_configured": true, 00:13:13.533 "data_offset": 2048, 00:13:13.533 "data_size": 63488 00:13:13.533 } 00:13:13.533 ] 00:13:13.533 }' 00:13:13.533 13:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.533 13:41:27 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:14.101 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.101 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:14.361 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:14.361 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:14.621 [2024-06-10 13:41:28.860006] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.621 13:41:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.621 13:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.882 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.882 "name": "Existed_Raid", 00:13:14.882 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:14.882 "strip_size_kb": 64, 00:13:14.882 "state": "configuring", 00:13:14.882 "raid_level": "concat", 00:13:14.882 "superblock": true, 00:13:14.882 "num_base_bdevs": 3, 00:13:14.882 "num_base_bdevs_discovered": 2, 00:13:14.882 "num_base_bdevs_operational": 3, 00:13:14.882 "base_bdevs_list": [ 00:13:14.882 { 00:13:14.882 "name": null, 00:13:14.882 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:14.882 "is_configured": false, 00:13:14.882 "data_offset": 2048, 00:13:14.882 "data_size": 63488 00:13:14.882 }, 00:13:14.882 { 00:13:14.882 "name": "BaseBdev2", 00:13:14.882 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:14.882 "is_configured": true, 00:13:14.882 "data_offset": 2048, 00:13:14.882 "data_size": 63488 00:13:14.882 }, 00:13:14.882 { 00:13:14.882 "name": "BaseBdev3", 00:13:14.882 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:14.882 "is_configured": true, 00:13:14.882 "data_offset": 2048, 00:13:14.882 "data_size": 63488 00:13:14.882 } 00:13:14.882 ] 00:13:14.882 }' 00:13:14.882 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.882 13:41:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:15.453 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.453 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:15.453 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:15.453 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.453 13:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:15.714 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c1e186dc-a50a-4e78-a9f2-33887999361d 00:13:15.974 [2024-06-10 13:41:30.256517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:15.974 [2024-06-10 13:41:30.256629] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13ae090 00:13:15.974 [2024-06-10 13:41:30.256637] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:15.974 [2024-06-10 13:41:30.256785] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x155acc0 00:13:15.974 [2024-06-10 13:41:30.256874] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13ae090 00:13:15.974 [2024-06-10 13:41:30.256880] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13ae090 00:13:15.974 [2024-06-10 13:41:30.256953] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:15.974 NewBaseBdev 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:15.974 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.234 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:16.234 [ 00:13:16.234 { 00:13:16.234 "name": "NewBaseBdev", 00:13:16.234 "aliases": [ 00:13:16.234 "c1e186dc-a50a-4e78-a9f2-33887999361d" 00:13:16.234 ], 00:13:16.234 "product_name": "Malloc disk", 00:13:16.234 "block_size": 512, 00:13:16.234 "num_blocks": 65536, 00:13:16.234 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:16.234 "assigned_rate_limits": { 00:13:16.234 "rw_ios_per_sec": 0, 00:13:16.234 "rw_mbytes_per_sec": 0, 00:13:16.234 "r_mbytes_per_sec": 0, 00:13:16.234 "w_mbytes_per_sec": 0 00:13:16.234 }, 00:13:16.234 "claimed": true, 00:13:16.234 "claim_type": "exclusive_write", 00:13:16.234 "zoned": false, 00:13:16.234 "supported_io_types": { 00:13:16.234 "read": true, 00:13:16.234 "write": true, 00:13:16.234 "unmap": true, 00:13:16.235 "write_zeroes": true, 00:13:16.235 "flush": true, 00:13:16.235 "reset": true, 00:13:16.235 "compare": false, 00:13:16.235 "compare_and_write": false, 00:13:16.235 "abort": true, 00:13:16.235 "nvme_admin": false, 00:13:16.235 "nvme_io": false 00:13:16.235 }, 00:13:16.235 
"memory_domains": [ 00:13:16.235 { 00:13:16.235 "dma_device_id": "system", 00:13:16.235 "dma_device_type": 1 00:13:16.235 }, 00:13:16.235 { 00:13:16.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.235 "dma_device_type": 2 00:13:16.235 } 00:13:16.235 ], 00:13:16.235 "driver_specific": {} 00:13:16.235 } 00:13:16.235 ] 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.235 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.495 13:41:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.495 "name": "Existed_Raid", 00:13:16.495 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:16.495 "strip_size_kb": 64, 00:13:16.495 "state": "online", 00:13:16.495 "raid_level": "concat", 00:13:16.495 "superblock": true, 00:13:16.495 "num_base_bdevs": 3, 00:13:16.495 "num_base_bdevs_discovered": 3, 00:13:16.495 "num_base_bdevs_operational": 3, 00:13:16.495 "base_bdevs_list": [ 00:13:16.495 { 00:13:16.495 "name": "NewBaseBdev", 00:13:16.495 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:16.495 "is_configured": true, 00:13:16.495 "data_offset": 2048, 00:13:16.495 "data_size": 63488 00:13:16.495 }, 00:13:16.495 { 00:13:16.495 "name": "BaseBdev2", 00:13:16.495 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:16.495 "is_configured": true, 00:13:16.495 "data_offset": 2048, 00:13:16.495 "data_size": 63488 00:13:16.495 }, 00:13:16.495 { 00:13:16.495 "name": "BaseBdev3", 00:13:16.495 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:16.495 "is_configured": true, 00:13:16.495 "data_offset": 2048, 00:13:16.495 "data_size": 63488 00:13:16.495 } 00:13:16.495 ] 00:13:16.495 }' 00:13:16.495 13:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.495 13:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:17.066 [2024-06-10 13:41:31.491894] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:17.066 "name": "Existed_Raid", 00:13:17.066 "aliases": [ 00:13:17.066 "bb0477fc-30a5-4129-b174-9df31633cdf2" 00:13:17.066 ], 00:13:17.066 "product_name": "Raid Volume", 00:13:17.066 "block_size": 512, 00:13:17.066 "num_blocks": 190464, 00:13:17.066 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:17.066 "assigned_rate_limits": { 00:13:17.066 "rw_ios_per_sec": 0, 00:13:17.066 "rw_mbytes_per_sec": 0, 00:13:17.066 "r_mbytes_per_sec": 0, 00:13:17.066 "w_mbytes_per_sec": 0 00:13:17.066 }, 00:13:17.066 "claimed": false, 00:13:17.066 "zoned": false, 00:13:17.066 "supported_io_types": { 00:13:17.066 "read": true, 00:13:17.066 "write": true, 00:13:17.066 "unmap": true, 00:13:17.066 "write_zeroes": true, 00:13:17.066 "flush": true, 00:13:17.066 "reset": true, 00:13:17.066 "compare": false, 00:13:17.066 "compare_and_write": false, 00:13:17.066 "abort": false, 00:13:17.066 "nvme_admin": false, 00:13:17.066 "nvme_io": false 00:13:17.066 }, 00:13:17.066 "memory_domains": [ 00:13:17.066 { 00:13:17.066 "dma_device_id": "system", 00:13:17.066 "dma_device_type": 1 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.066 "dma_device_type": 2 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "dma_device_id": "system", 00:13:17.066 "dma_device_type": 1 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:17.066 "dma_device_type": 2 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "dma_device_id": "system", 00:13:17.066 "dma_device_type": 1 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.066 "dma_device_type": 2 00:13:17.066 } 00:13:17.066 ], 00:13:17.066 "driver_specific": { 00:13:17.066 "raid": { 00:13:17.066 "uuid": "bb0477fc-30a5-4129-b174-9df31633cdf2", 00:13:17.066 "strip_size_kb": 64, 00:13:17.066 "state": "online", 00:13:17.066 "raid_level": "concat", 00:13:17.066 "superblock": true, 00:13:17.066 "num_base_bdevs": 3, 00:13:17.066 "num_base_bdevs_discovered": 3, 00:13:17.066 "num_base_bdevs_operational": 3, 00:13:17.066 "base_bdevs_list": [ 00:13:17.066 { 00:13:17.066 "name": "NewBaseBdev", 00:13:17.066 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:17.066 "is_configured": true, 00:13:17.066 "data_offset": 2048, 00:13:17.066 "data_size": 63488 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "name": "BaseBdev2", 00:13:17.066 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:17.066 "is_configured": true, 00:13:17.066 "data_offset": 2048, 00:13:17.066 "data_size": 63488 00:13:17.066 }, 00:13:17.066 { 00:13:17.066 "name": "BaseBdev3", 00:13:17.066 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:17.066 "is_configured": true, 00:13:17.066 "data_offset": 2048, 00:13:17.066 "data_size": 63488 00:13:17.066 } 00:13:17.066 ] 00:13:17.066 } 00:13:17.066 } 00:13:17.066 }' 00:13:17.066 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:17.326 BaseBdev2 00:13:17.326 BaseBdev3' 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:17.326 "name": "NewBaseBdev", 00:13:17.326 "aliases": [ 00:13:17.326 "c1e186dc-a50a-4e78-a9f2-33887999361d" 00:13:17.326 ], 00:13:17.326 "product_name": "Malloc disk", 00:13:17.326 "block_size": 512, 00:13:17.326 "num_blocks": 65536, 00:13:17.326 "uuid": "c1e186dc-a50a-4e78-a9f2-33887999361d", 00:13:17.326 "assigned_rate_limits": { 00:13:17.326 "rw_ios_per_sec": 0, 00:13:17.326 "rw_mbytes_per_sec": 0, 00:13:17.326 "r_mbytes_per_sec": 0, 00:13:17.326 "w_mbytes_per_sec": 0 00:13:17.326 }, 00:13:17.326 "claimed": true, 00:13:17.326 "claim_type": "exclusive_write", 00:13:17.326 "zoned": false, 00:13:17.326 "supported_io_types": { 00:13:17.326 "read": true, 00:13:17.326 "write": true, 00:13:17.326 "unmap": true, 00:13:17.326 "write_zeroes": true, 00:13:17.326 "flush": true, 00:13:17.326 "reset": true, 00:13:17.326 "compare": false, 00:13:17.326 "compare_and_write": false, 00:13:17.326 "abort": true, 00:13:17.326 "nvme_admin": false, 00:13:17.326 "nvme_io": false 00:13:17.326 }, 00:13:17.326 "memory_domains": [ 00:13:17.326 { 00:13:17.326 "dma_device_id": "system", 00:13:17.326 "dma_device_type": 1 00:13:17.326 }, 00:13:17.326 { 00:13:17.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.326 "dma_device_type": 2 00:13:17.326 } 00:13:17.326 ], 00:13:17.326 "driver_specific": {} 00:13:17.326 }' 00:13:17.326 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 
00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.586 13:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.586 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.586 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.845 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.845 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.845 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.845 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:17.846 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.105 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.105 "name": "BaseBdev2", 00:13:18.105 "aliases": [ 00:13:18.105 "870dc945-ae76-4722-8c4b-2e44032e5000" 00:13:18.106 ], 00:13:18.106 "product_name": "Malloc disk", 00:13:18.106 "block_size": 512, 00:13:18.106 "num_blocks": 65536, 00:13:18.106 "uuid": "870dc945-ae76-4722-8c4b-2e44032e5000", 00:13:18.106 "assigned_rate_limits": { 00:13:18.106 "rw_ios_per_sec": 0, 00:13:18.106 "rw_mbytes_per_sec": 0, 00:13:18.106 "r_mbytes_per_sec": 0, 00:13:18.106 "w_mbytes_per_sec": 0 00:13:18.106 }, 00:13:18.106 
"claimed": true, 00:13:18.106 "claim_type": "exclusive_write", 00:13:18.106 "zoned": false, 00:13:18.106 "supported_io_types": { 00:13:18.106 "read": true, 00:13:18.106 "write": true, 00:13:18.106 "unmap": true, 00:13:18.106 "write_zeroes": true, 00:13:18.106 "flush": true, 00:13:18.106 "reset": true, 00:13:18.106 "compare": false, 00:13:18.106 "compare_and_write": false, 00:13:18.106 "abort": true, 00:13:18.106 "nvme_admin": false, 00:13:18.106 "nvme_io": false 00:13:18.106 }, 00:13:18.106 "memory_domains": [ 00:13:18.106 { 00:13:18.106 "dma_device_id": "system", 00:13:18.106 "dma_device_type": 1 00:13:18.106 }, 00:13:18.106 { 00:13:18.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.106 "dma_device_type": 2 00:13:18.106 } 00:13:18.106 ], 00:13:18.106 "driver_specific": {} 00:13:18.106 }' 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.106 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.366 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.366 13:41:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.366 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.366 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:18.366 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.626 "name": "BaseBdev3", 00:13:18.626 "aliases": [ 00:13:18.626 "8771d8d1-0e94-4f1b-bbbc-38ede62edeed" 00:13:18.626 ], 00:13:18.626 "product_name": "Malloc disk", 00:13:18.626 "block_size": 512, 00:13:18.626 "num_blocks": 65536, 00:13:18.626 "uuid": "8771d8d1-0e94-4f1b-bbbc-38ede62edeed", 00:13:18.626 "assigned_rate_limits": { 00:13:18.626 "rw_ios_per_sec": 0, 00:13:18.626 "rw_mbytes_per_sec": 0, 00:13:18.626 "r_mbytes_per_sec": 0, 00:13:18.626 "w_mbytes_per_sec": 0 00:13:18.626 }, 00:13:18.626 "claimed": true, 00:13:18.626 "claim_type": "exclusive_write", 00:13:18.626 "zoned": false, 00:13:18.626 "supported_io_types": { 00:13:18.626 "read": true, 00:13:18.626 "write": true, 00:13:18.626 "unmap": true, 00:13:18.626 "write_zeroes": true, 00:13:18.626 "flush": true, 00:13:18.626 "reset": true, 00:13:18.626 "compare": false, 00:13:18.626 "compare_and_write": false, 00:13:18.626 "abort": true, 00:13:18.626 "nvme_admin": false, 00:13:18.626 "nvme_io": false 00:13:18.626 }, 00:13:18.626 "memory_domains": [ 00:13:18.626 { 00:13:18.626 "dma_device_id": "system", 00:13:18.626 "dma_device_type": 1 00:13:18.626 }, 00:13:18.626 { 00:13:18.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.626 "dma_device_type": 2 00:13:18.626 } 00:13:18.626 ], 00:13:18.626 "driver_specific": {} 00:13:18.626 }' 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.626 13:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.626 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.626 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.626 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.626 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.626 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.886 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.886 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.886 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:18.886 [2024-06-10 13:41:33.348402] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:18.886 [2024-06-10 13:41:33.348418] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:18.886 [2024-06-10 13:41:33.348457] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:18.886 [2024-06-10 13:41:33.348495] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:18.886 [2024-06-10 13:41:33.348501] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ae090 name Existed_Raid, state offline 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1534379 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1534379 ']' 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1534379 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:19.146 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1534379 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1534379' 00:13:19.147 killing process with pid 1534379 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1534379 00:13:19.147 [2024-06-10 13:41:33.400209] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1534379 00:13:19.147 [2024-06-10 13:41:33.415612] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:19.147 00:13:19.147 real 0m24.751s 00:13:19.147 user 0m46.234s 00:13:19.147 sys 0m3.757s 00:13:19.147 13:41:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:19.147 13:41:33 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.147 ************************************ 00:13:19.147 END TEST raid_state_function_test_sb 00:13:19.147 ************************************ 00:13:19.147 13:41:33 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:13:19.147 13:41:33 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:13:19.147 13:41:33 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:19.147 13:41:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:19.147 ************************************ 00:13:19.147 START TEST raid_superblock_test 00:13:19.147 ************************************ 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 3 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1539911 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1539911 /var/tmp/spdk-raid.sock 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1539911 ']' 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:19.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:19.147 13:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.407 [2024-06-10 13:41:33.671175] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
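The `waitforlisten 1539911 /var/tmp/spdk-raid.sock` step above polls until the freshly launched `bdev_svc` process accepts connections on its UNIX-domain RPC socket. A minimal sketch of that polling loop, using a locally created listener as a stand-in for `bdev_svc` (the socket path and delay here are illustrative, not taken from the test):

```python
import os
import socket
import tempfile
import threading
import time

def wait_for_listen(sock_path, timeout=5.0, interval=0.05):
    """Poll until something accepts connections on sock_path, or time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(sock_path)
            return True          # daemon is up and listening
        except OSError:
            time.sleep(interval)  # not there yet; retry
        finally:
            s.close()
    return False

# Stand-in for bdev_svc: a listener that appears after a short startup delay.
path = os.path.join(tempfile.mkdtemp(), "spdk-raid.sock")

def serve():
    time.sleep(0.2)  # simulated daemon startup time
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(path)
    srv.listen(1)
    time.sleep(2)    # stay alive long enough for the poller to connect
    srv.close()

threading.Thread(target=serve, daemon=True).start()
print(wait_for_listen(path))  # True once the listener comes up
```

The real `waitforlisten` in `autotest_common.sh` also checks that the PID is still alive between retries, so a crashed daemon fails fast instead of burning the whole timeout.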
00:13:19.407 [2024-06-10 13:41:33.671223] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1539911 ] 00:13:19.407 [2024-06-10 13:41:33.759346] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:19.407 [2024-06-10 13:41:33.824341] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:13:19.407 [2024-06-10 13:41:33.874823] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:19.407 [2024-06-10 13:41:33.874846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:13:20.348 malloc1 00:13:20.348 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:20.609 [2024-06-10 13:41:34.918067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:20.609 [2024-06-10 13:41:34.918101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:20.609 [2024-06-10 13:41:34.918114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233d550 00:13:20.609 [2024-06-10 13:41:34.918121] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:20.609 [2024-06-10 13:41:34.919458] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:20.609 [2024-06-10 13:41:34.919479] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:20.609 pt1 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:20.609 13:41:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:20.870 malloc2 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:20.870 [2024-06-10 13:41:35.325208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:20.870 [2024-06-10 13:41:35.325237] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:20.870 [2024-06-10 13:41:35.325247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ff0f0 00:13:20.870 [2024-06-10 13:41:35.325254] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:20.870 [2024-06-10 13:41:35.326494] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:20.870 [2024-06-10 13:41:35.326514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:20.870 pt2 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:20.870 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:21.130 malloc3 00:13:21.130 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:21.390 [2024-06-10 13:41:35.732255] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:21.390 [2024-06-10 13:41:35.732283] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:21.390 [2024-06-10 13:41:35.732297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24005b0 00:13:21.390 [2024-06-10 13:41:35.732304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:21.390 [2024-06-10 13:41:35.733556] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:21.390 [2024-06-10 13:41:35.733574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:21.390 pt3 00:13:21.390 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:21.390 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:21.390 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:21.650 [2024-06-10 13:41:35.936789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:21.650 [2024-06-10 13:41:35.937836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:21.650 [2024-06-10 
13:41:35.937881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:21.650 [2024-06-10 13:41:35.938004] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2335f30 00:13:21.650 [2024-06-10 13:41:35.938012] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:21.650 [2024-06-10 13:41:35.938172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x233f500 00:13:21.650 [2024-06-10 13:41:35.938284] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2335f30 00:13:21.650 [2024-06-10 13:41:35.938290] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2335f30 00:13:21.650 [2024-06-10 13:41:35.938365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.650 13:41:35 
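The `raid_bdev_configure_cont` debug line above reports `blockcnt 190464, blocklen 512` for the new concat volume. That figure is recoverable from the logged parameters: each base bdev is a 32 MiB malloc (`bdev_malloc_create 32 512`), and the superblock (`-s`) reserves `data_offset` 2048 blocks per base, leaving the `data_size` 63488 shown in the JSON dumps. A sketch of the arithmetic, assuming concat capacity is simply the sum of the base bdevs' data regions (consistent with the log, though the summation rule itself is an inference, not stated in it):

```python
MIB = 1024 * 1024

malloc_size_mib = 32   # bdev_malloc_create 32 512
block_size = 512
num_base_bdevs = 3
data_offset = 2048     # blocks reserved per base bdev for the superblock (from the log)

blocks_per_base = malloc_size_mib * MIB // block_size  # 65536 blocks per malloc bdev
data_blocks = blocks_per_base - data_offset            # 63488, the logged "data_size"
raid_blocks = num_base_bdevs * data_blocks             # concat sums the data regions

print(raid_blocks)  # 190464, matching "blockcnt 190464, blocklen 512"
```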
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.650 13:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:21.910 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.910 "name": "raid_bdev1", 00:13:21.910 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:21.910 "strip_size_kb": 64, 00:13:21.910 "state": "online", 00:13:21.910 "raid_level": "concat", 00:13:21.910 "superblock": true, 00:13:21.910 "num_base_bdevs": 3, 00:13:21.910 "num_base_bdevs_discovered": 3, 00:13:21.910 "num_base_bdevs_operational": 3, 00:13:21.910 "base_bdevs_list": [ 00:13:21.910 { 00:13:21.910 "name": "pt1", 00:13:21.910 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:21.910 "is_configured": true, 00:13:21.910 "data_offset": 2048, 00:13:21.910 "data_size": 63488 00:13:21.910 }, 00:13:21.910 { 00:13:21.910 "name": "pt2", 00:13:21.910 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:21.910 "is_configured": true, 00:13:21.910 "data_offset": 2048, 00:13:21.910 "data_size": 63488 00:13:21.910 }, 00:13:21.910 { 00:13:21.910 "name": "pt3", 00:13:21.910 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:21.910 "is_configured": true, 00:13:21.910 "data_offset": 2048, 00:13:21.910 "data_size": 63488 00:13:21.910 } 00:13:21.910 ] 00:13:21.910 }' 00:13:21.910 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.910 13:41:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:22.480 [2024-06-10 13:41:36.935509] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.480 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:22.480 "name": "raid_bdev1", 00:13:22.480 "aliases": [ 00:13:22.480 "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c" 00:13:22.480 ], 00:13:22.480 "product_name": "Raid Volume", 00:13:22.480 "block_size": 512, 00:13:22.480 "num_blocks": 190464, 00:13:22.480 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:22.480 "assigned_rate_limits": { 00:13:22.480 "rw_ios_per_sec": 0, 00:13:22.480 "rw_mbytes_per_sec": 0, 00:13:22.480 "r_mbytes_per_sec": 0, 00:13:22.480 "w_mbytes_per_sec": 0 00:13:22.480 }, 00:13:22.480 "claimed": false, 00:13:22.480 "zoned": false, 00:13:22.480 "supported_io_types": { 00:13:22.480 "read": true, 00:13:22.480 "write": true, 00:13:22.480 "unmap": true, 00:13:22.480 "write_zeroes": true, 00:13:22.480 "flush": true, 00:13:22.480 "reset": true, 00:13:22.480 "compare": false, 00:13:22.480 "compare_and_write": false, 00:13:22.480 "abort": false, 00:13:22.480 "nvme_admin": false, 00:13:22.480 "nvme_io": false 00:13:22.480 }, 00:13:22.480 "memory_domains": [ 00:13:22.480 { 00:13:22.480 "dma_device_id": "system", 00:13:22.480 "dma_device_type": 1 00:13:22.480 }, 00:13:22.480 { 00:13:22.480 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:22.480 "dma_device_type": 2 00:13:22.480 }, 00:13:22.480 { 00:13:22.480 "dma_device_id": "system", 00:13:22.480 "dma_device_type": 1 00:13:22.480 }, 00:13:22.480 { 00:13:22.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.480 "dma_device_type": 2 00:13:22.480 }, 00:13:22.480 { 00:13:22.480 "dma_device_id": "system", 00:13:22.480 "dma_device_type": 1 00:13:22.480 }, 00:13:22.480 { 00:13:22.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.481 "dma_device_type": 2 00:13:22.481 } 00:13:22.481 ], 00:13:22.481 "driver_specific": { 00:13:22.481 "raid": { 00:13:22.481 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:22.481 "strip_size_kb": 64, 00:13:22.481 "state": "online", 00:13:22.481 "raid_level": "concat", 00:13:22.481 "superblock": true, 00:13:22.481 "num_base_bdevs": 3, 00:13:22.481 "num_base_bdevs_discovered": 3, 00:13:22.481 "num_base_bdevs_operational": 3, 00:13:22.481 "base_bdevs_list": [ 00:13:22.481 { 00:13:22.481 "name": "pt1", 00:13:22.481 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:22.481 "is_configured": true, 00:13:22.481 "data_offset": 2048, 00:13:22.481 "data_size": 63488 00:13:22.481 }, 00:13:22.481 { 00:13:22.481 "name": "pt2", 00:13:22.481 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:22.481 "is_configured": true, 00:13:22.481 "data_offset": 2048, 00:13:22.481 "data_size": 63488 00:13:22.481 }, 00:13:22.481 { 00:13:22.481 "name": "pt3", 00:13:22.481 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:22.481 "is_configured": true, 00:13:22.481 "data_offset": 2048, 00:13:22.481 "data_size": 63488 00:13:22.481 } 00:13:22.481 ] 00:13:22.481 } 00:13:22.481 } 00:13:22.481 }' 00:13:22.481 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:22.740 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:22.740 pt2 00:13:22.740 pt3' 00:13:22.740 
13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:22.740 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:22.740 13:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:22.740 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:22.740 "name": "pt1", 00:13:22.740 "aliases": [ 00:13:22.740 "00000000-0000-0000-0000-000000000001" 00:13:22.740 ], 00:13:22.740 "product_name": "passthru", 00:13:22.740 "block_size": 512, 00:13:22.740 "num_blocks": 65536, 00:13:22.740 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:22.741 "assigned_rate_limits": { 00:13:22.741 "rw_ios_per_sec": 0, 00:13:22.741 "rw_mbytes_per_sec": 0, 00:13:22.741 "r_mbytes_per_sec": 0, 00:13:22.741 "w_mbytes_per_sec": 0 00:13:22.741 }, 00:13:22.741 "claimed": true, 00:13:22.741 "claim_type": "exclusive_write", 00:13:22.741 "zoned": false, 00:13:22.741 "supported_io_types": { 00:13:22.741 "read": true, 00:13:22.741 "write": true, 00:13:22.741 "unmap": true, 00:13:22.741 "write_zeroes": true, 00:13:22.741 "flush": true, 00:13:22.741 "reset": true, 00:13:22.741 "compare": false, 00:13:22.741 "compare_and_write": false, 00:13:22.741 "abort": true, 00:13:22.741 "nvme_admin": false, 00:13:22.741 "nvme_io": false 00:13:22.741 }, 00:13:22.741 "memory_domains": [ 00:13:22.741 { 00:13:22.741 "dma_device_id": "system", 00:13:22.741 "dma_device_type": 1 00:13:22.741 }, 00:13:22.741 { 00:13:22.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.741 "dma_device_type": 2 00:13:22.741 } 00:13:22.741 ], 00:13:22.741 "driver_specific": { 00:13:22.741 "passthru": { 00:13:22.741 "name": "pt1", 00:13:22.741 "base_bdev_name": "malloc1" 00:13:22.741 } 00:13:22.741 } 00:13:22.741 }' 00:13:22.741 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.001 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.261 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.261 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.262 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.262 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:23.262 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:23.522 "name": "pt2", 00:13:23.522 "aliases": [ 00:13:23.522 "00000000-0000-0000-0000-000000000002" 00:13:23.522 ], 00:13:23.522 "product_name": "passthru", 00:13:23.522 "block_size": 512, 00:13:23.522 "num_blocks": 65536, 00:13:23.522 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:23.522 "assigned_rate_limits": { 00:13:23.522 "rw_ios_per_sec": 0, 00:13:23.522 
"rw_mbytes_per_sec": 0, 00:13:23.522 "r_mbytes_per_sec": 0, 00:13:23.522 "w_mbytes_per_sec": 0 00:13:23.522 }, 00:13:23.522 "claimed": true, 00:13:23.522 "claim_type": "exclusive_write", 00:13:23.522 "zoned": false, 00:13:23.522 "supported_io_types": { 00:13:23.522 "read": true, 00:13:23.522 "write": true, 00:13:23.522 "unmap": true, 00:13:23.522 "write_zeroes": true, 00:13:23.522 "flush": true, 00:13:23.522 "reset": true, 00:13:23.522 "compare": false, 00:13:23.522 "compare_and_write": false, 00:13:23.522 "abort": true, 00:13:23.522 "nvme_admin": false, 00:13:23.522 "nvme_io": false 00:13:23.522 }, 00:13:23.522 "memory_domains": [ 00:13:23.522 { 00:13:23.522 "dma_device_id": "system", 00:13:23.522 "dma_device_type": 1 00:13:23.522 }, 00:13:23.522 { 00:13:23.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.522 "dma_device_type": 2 00:13:23.522 } 00:13:23.522 ], 00:13:23.522 "driver_specific": { 00:13:23.522 "passthru": { 00:13:23.522 "name": "pt2", 00:13:23.522 "base_bdev_name": "malloc2" 00:13:23.522 } 00:13:23.522 } 00:13:23.522 }' 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.522 13:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.782 13:41:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:23.782 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.043 "name": "pt3", 00:13:24.043 "aliases": [ 00:13:24.043 "00000000-0000-0000-0000-000000000003" 00:13:24.043 ], 00:13:24.043 "product_name": "passthru", 00:13:24.043 "block_size": 512, 00:13:24.043 "num_blocks": 65536, 00:13:24.043 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:24.043 "assigned_rate_limits": { 00:13:24.043 "rw_ios_per_sec": 0, 00:13:24.043 "rw_mbytes_per_sec": 0, 00:13:24.043 "r_mbytes_per_sec": 0, 00:13:24.043 "w_mbytes_per_sec": 0 00:13:24.043 }, 00:13:24.043 "claimed": true, 00:13:24.043 "claim_type": "exclusive_write", 00:13:24.043 "zoned": false, 00:13:24.043 "supported_io_types": { 00:13:24.043 "read": true, 00:13:24.043 "write": true, 00:13:24.043 "unmap": true, 00:13:24.043 "write_zeroes": true, 00:13:24.043 "flush": true, 00:13:24.043 "reset": true, 00:13:24.043 "compare": false, 00:13:24.043 "compare_and_write": false, 00:13:24.043 "abort": true, 00:13:24.043 "nvme_admin": false, 00:13:24.043 "nvme_io": false 00:13:24.043 }, 00:13:24.043 "memory_domains": [ 00:13:24.043 { 00:13:24.043 "dma_device_id": "system", 00:13:24.043 "dma_device_type": 1 00:13:24.043 }, 00:13:24.043 { 00:13:24.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.043 "dma_device_type": 2 
00:13:24.043 } 00:13:24.043 ], 00:13:24.043 "driver_specific": { 00:13:24.043 "passthru": { 00:13:24.043 "name": "pt3", 00:13:24.043 "base_bdev_name": "malloc3" 00:13:24.043 } 00:13:24.043 } 00:13:24.043 }' 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.043 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:24.303 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:24.562 [2024-06-10 13:41:38.848365] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:24.562 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c 00:13:24.562 13:41:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c ']' 00:13:24.562 13:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:24.822 [2024-06-10 13:41:39.052667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:24.822 [2024-06-10 13:41:39.052680] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:24.822 [2024-06-10 13:41:39.052721] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:24.822 [2024-06-10 13:41:39.052762] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:24.822 [2024-06-10 13:41:39.052773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2335f30 name raid_bdev1, state offline 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:24.822 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:25.082 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:25.082 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:25.342 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:25.342 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:25.602 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:25.602 13:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:25.863 [2024-06-10 13:41:40.287762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:25.863 [2024-06-10 13:41:40.288905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:25.863 [2024-06-10 13:41:40.288942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:25.863 [2024-06-10 13:41:40.288980] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:25.863 [2024-06-10 13:41:40.289017] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:25.863 [2024-06-10 13:41:40.289032] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:25.863 [2024-06-10 13:41:40.289042] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:25.863 [2024-06-10 13:41:40.289047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x233df10 name raid_bdev1, state configuring 00:13:25.863 request: 00:13:25.863 { 00:13:25.863 "name": "raid_bdev1", 00:13:25.863 "raid_level": "concat", 00:13:25.863 "base_bdevs": [ 00:13:25.863 "malloc1", 00:13:25.863 "malloc2", 00:13:25.863 "malloc3" 00:13:25.863 ], 00:13:25.863 "superblock": false, 00:13:25.863 "strip_size_kb": 64, 00:13:25.863 "method": "bdev_raid_create", 00:13:25.863 "req_id": 1 00:13:25.863 } 00:13:25.863 Got JSON-RPC error response 00:13:25.863 response: 00:13:25.863 { 00:13:25.863 "code": -17, 00:13:25.863 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:25.863 } 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.863 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:26.123 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:26.123 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:26.123 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:26.383 [2024-06-10 13:41:40.712778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:26.383 [2024-06-10 13:41:40.712809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:13:26.383 [2024-06-10 13:41:40.712819] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23407c0 00:13:26.383 [2024-06-10 13:41:40.712826] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:26.383 [2024-06-10 13:41:40.714155] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:26.383 [2024-06-10 13:41:40.714183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:26.383 [2024-06-10 13:41:40.714229] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:26.383 [2024-06-10 13:41:40.714247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:26.383 pt1 00:13:26.383 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:26.383 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:26.383 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.383 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:26.383 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.384 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:26.643 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.643 "name": "raid_bdev1", 00:13:26.643 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:26.643 "strip_size_kb": 64, 00:13:26.643 "state": "configuring", 00:13:26.643 "raid_level": "concat", 00:13:26.643 "superblock": true, 00:13:26.643 "num_base_bdevs": 3, 00:13:26.643 "num_base_bdevs_discovered": 1, 00:13:26.643 "num_base_bdevs_operational": 3, 00:13:26.643 "base_bdevs_list": [ 00:13:26.643 { 00:13:26.643 "name": "pt1", 00:13:26.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.643 "is_configured": true, 00:13:26.643 "data_offset": 2048, 00:13:26.643 "data_size": 63488 00:13:26.643 }, 00:13:26.643 { 00:13:26.643 "name": null, 00:13:26.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:26.643 "is_configured": false, 00:13:26.643 "data_offset": 2048, 00:13:26.643 "data_size": 63488 00:13:26.643 }, 00:13:26.643 { 00:13:26.643 "name": null, 00:13:26.643 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:26.643 "is_configured": false, 00:13:26.643 "data_offset": 2048, 00:13:26.643 "data_size": 63488 00:13:26.643 } 00:13:26.643 ] 00:13:26.643 }' 00:13:26.643 13:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.643 13:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.213 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:27.213 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:27.473 
[2024-06-10 13:41:41.707304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:27.473 [2024-06-10 13:41:41.707336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.473 [2024-06-10 13:41:41.707347] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x233d780 00:13:27.473 [2024-06-10 13:41:41.707354] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.473 [2024-06-10 13:41:41.707630] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.473 [2024-06-10 13:41:41.707642] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:27.473 [2024-06-10 13:41:41.707684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:27.473 [2024-06-10 13:41:41.707697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:27.473 pt2 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:27.473 [2024-06-10 13:41:41.911831] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.473 13:41:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.473 13:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:27.733 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.733 "name": "raid_bdev1", 00:13:27.733 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:27.733 "strip_size_kb": 64, 00:13:27.733 "state": "configuring", 00:13:27.733 "raid_level": "concat", 00:13:27.733 "superblock": true, 00:13:27.733 "num_base_bdevs": 3, 00:13:27.733 "num_base_bdevs_discovered": 1, 00:13:27.733 "num_base_bdevs_operational": 3, 00:13:27.733 "base_bdevs_list": [ 00:13:27.733 { 00:13:27.733 "name": "pt1", 00:13:27.733 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:27.733 "is_configured": true, 00:13:27.733 "data_offset": 2048, 00:13:27.733 "data_size": 63488 00:13:27.733 }, 00:13:27.733 { 00:13:27.733 "name": null, 00:13:27.733 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.734 "is_configured": false, 00:13:27.734 "data_offset": 2048, 00:13:27.734 "data_size": 63488 00:13:27.734 }, 00:13:27.734 { 00:13:27.734 "name": null, 00:13:27.734 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:27.734 "is_configured": false, 00:13:27.734 "data_offset": 2048, 00:13:27.734 "data_size": 63488 00:13:27.734 } 00:13:27.734 ] 00:13:27.734 }' 00:13:27.734 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:13:27.734 13:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.304 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:28.304 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.304 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:28.564 [2024-06-10 13:41:42.874273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:28.564 [2024-06-10 13:41:42.874308] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.564 [2024-06-10 13:41:42.874321] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ff320 00:13:28.564 [2024-06-10 13:41:42.874327] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.564 [2024-06-10 13:41:42.874610] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.564 [2024-06-10 13:41:42.874622] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:28.564 [2024-06-10 13:41:42.874668] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:28.564 [2024-06-10 13:41:42.874680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:28.564 pt2 00:13:28.564 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:28.564 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.564 13:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:13:28.824 [2024-06-10 13:41:43.066759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:28.824 [2024-06-10 13:41:43.066778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.824 [2024-06-10 13:41:43.066789] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2334150 00:13:28.824 [2024-06-10 13:41:43.066795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.824 [2024-06-10 13:41:43.067028] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.824 [2024-06-10 13:41:43.067040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:28.824 [2024-06-10 13:41:43.067073] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:28.824 [2024-06-10 13:41:43.067085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:28.824 [2024-06-10 13:41:43.067171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x233f720 00:13:28.824 [2024-06-10 13:41:43.067178] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:28.824 [2024-06-10 13:41:43.067318] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23376a0 00:13:28.824 [2024-06-10 13:41:43.067419] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x233f720 00:13:28.824 [2024-06-10 13:41:43.067424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x233f720 00:13:28.824 [2024-06-10 13:41:43.067501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.824 pt3 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:28.824 13:41:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.824 "name": "raid_bdev1", 00:13:28.824 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:28.824 "strip_size_kb": 64, 00:13:28.824 "state": "online", 00:13:28.824 "raid_level": "concat", 00:13:28.824 "superblock": true, 00:13:28.824 "num_base_bdevs": 3, 00:13:28.824 "num_base_bdevs_discovered": 3, 00:13:28.824 "num_base_bdevs_operational": 3, 00:13:28.824 "base_bdevs_list": [ 00:13:28.824 { 00:13:28.824 "name": "pt1", 00:13:28.824 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:13:28.824 "is_configured": true, 00:13:28.824 "data_offset": 2048, 00:13:28.824 "data_size": 63488 00:13:28.824 }, 00:13:28.824 { 00:13:28.824 "name": "pt2", 00:13:28.824 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:28.824 "is_configured": true, 00:13:28.824 "data_offset": 2048, 00:13:28.824 "data_size": 63488 00:13:28.824 }, 00:13:28.824 { 00:13:28.824 "name": "pt3", 00:13:28.824 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:28.824 "is_configured": true, 00:13:28.824 "data_offset": 2048, 00:13:28.824 "data_size": 63488 00:13:28.824 } 00:13:28.824 ] 00:13:28.824 }' 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.824 13:41:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:29.393 13:41:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:29.653 [2024-06-10 13:41:44.037441] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:29.653 
"name": "raid_bdev1", 00:13:29.653 "aliases": [ 00:13:29.653 "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c" 00:13:29.653 ], 00:13:29.653 "product_name": "Raid Volume", 00:13:29.653 "block_size": 512, 00:13:29.653 "num_blocks": 190464, 00:13:29.653 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:29.653 "assigned_rate_limits": { 00:13:29.653 "rw_ios_per_sec": 0, 00:13:29.653 "rw_mbytes_per_sec": 0, 00:13:29.653 "r_mbytes_per_sec": 0, 00:13:29.653 "w_mbytes_per_sec": 0 00:13:29.653 }, 00:13:29.653 "claimed": false, 00:13:29.653 "zoned": false, 00:13:29.653 "supported_io_types": { 00:13:29.653 "read": true, 00:13:29.653 "write": true, 00:13:29.653 "unmap": true, 00:13:29.653 "write_zeroes": true, 00:13:29.653 "flush": true, 00:13:29.653 "reset": true, 00:13:29.653 "compare": false, 00:13:29.653 "compare_and_write": false, 00:13:29.653 "abort": false, 00:13:29.653 "nvme_admin": false, 00:13:29.653 "nvme_io": false 00:13:29.653 }, 00:13:29.653 "memory_domains": [ 00:13:29.653 { 00:13:29.653 "dma_device_id": "system", 00:13:29.653 "dma_device_type": 1 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.653 "dma_device_type": 2 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "dma_device_id": "system", 00:13:29.653 "dma_device_type": 1 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.653 "dma_device_type": 2 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "dma_device_id": "system", 00:13:29.653 "dma_device_type": 1 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.653 "dma_device_type": 2 00:13:29.653 } 00:13:29.653 ], 00:13:29.653 "driver_specific": { 00:13:29.653 "raid": { 00:13:29.653 "uuid": "e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c", 00:13:29.653 "strip_size_kb": 64, 00:13:29.653 "state": "online", 00:13:29.653 "raid_level": "concat", 00:13:29.653 "superblock": true, 00:13:29.653 "num_base_bdevs": 3, 00:13:29.653 "num_base_bdevs_discovered": 3, 
00:13:29.653 "num_base_bdevs_operational": 3, 00:13:29.653 "base_bdevs_list": [ 00:13:29.653 { 00:13:29.653 "name": "pt1", 00:13:29.653 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.653 "is_configured": true, 00:13:29.653 "data_offset": 2048, 00:13:29.653 "data_size": 63488 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "name": "pt2", 00:13:29.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.653 "is_configured": true, 00:13:29.653 "data_offset": 2048, 00:13:29.653 "data_size": 63488 00:13:29.653 }, 00:13:29.653 { 00:13:29.653 "name": "pt3", 00:13:29.653 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:29.653 "is_configured": true, 00:13:29.653 "data_offset": 2048, 00:13:29.653 "data_size": 63488 00:13:29.653 } 00:13:29.653 ] 00:13:29.653 } 00:13:29.653 } 00:13:29.653 }' 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:29.653 pt2 00:13:29.653 pt3' 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:29.653 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:29.914 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:29.914 "name": "pt1", 00:13:29.914 "aliases": [ 00:13:29.914 "00000000-0000-0000-0000-000000000001" 00:13:29.914 ], 00:13:29.914 "product_name": "passthru", 00:13:29.914 "block_size": 512, 00:13:29.914 "num_blocks": 65536, 00:13:29.914 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:29.914 "assigned_rate_limits": { 00:13:29.914 "rw_ios_per_sec": 0, 00:13:29.914 
"rw_mbytes_per_sec": 0, 00:13:29.914 "r_mbytes_per_sec": 0, 00:13:29.914 "w_mbytes_per_sec": 0 00:13:29.914 }, 00:13:29.914 "claimed": true, 00:13:29.914 "claim_type": "exclusive_write", 00:13:29.914 "zoned": false, 00:13:29.914 "supported_io_types": { 00:13:29.914 "read": true, 00:13:29.914 "write": true, 00:13:29.914 "unmap": true, 00:13:29.914 "write_zeroes": true, 00:13:29.914 "flush": true, 00:13:29.914 "reset": true, 00:13:29.914 "compare": false, 00:13:29.914 "compare_and_write": false, 00:13:29.914 "abort": true, 00:13:29.914 "nvme_admin": false, 00:13:29.914 "nvme_io": false 00:13:29.914 }, 00:13:29.914 "memory_domains": [ 00:13:29.914 { 00:13:29.914 "dma_device_id": "system", 00:13:29.914 "dma_device_type": 1 00:13:29.914 }, 00:13:29.914 { 00:13:29.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.914 "dma_device_type": 2 00:13:29.914 } 00:13:29.914 ], 00:13:29.914 "driver_specific": { 00:13:29.914 "passthru": { 00:13:29.914 "name": "pt1", 00:13:29.914 "base_bdev_name": "malloc1" 00:13:29.914 } 00:13:29.914 } 00:13:29.914 }' 00:13:29.914 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:29.914 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.174 13:41:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.174 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.434 "name": "pt2", 00:13:30.434 "aliases": [ 00:13:30.434 "00000000-0000-0000-0000-000000000002" 00:13:30.434 ], 00:13:30.434 "product_name": "passthru", 00:13:30.434 "block_size": 512, 00:13:30.434 "num_blocks": 65536, 00:13:30.434 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:30.434 "assigned_rate_limits": { 00:13:30.434 "rw_ios_per_sec": 0, 00:13:30.434 "rw_mbytes_per_sec": 0, 00:13:30.434 "r_mbytes_per_sec": 0, 00:13:30.434 "w_mbytes_per_sec": 0 00:13:30.434 }, 00:13:30.434 "claimed": true, 00:13:30.434 "claim_type": "exclusive_write", 00:13:30.434 "zoned": false, 00:13:30.434 "supported_io_types": { 00:13:30.434 "read": true, 00:13:30.434 "write": true, 00:13:30.434 "unmap": true, 00:13:30.434 "write_zeroes": true, 00:13:30.434 "flush": true, 00:13:30.434 "reset": true, 00:13:30.434 "compare": false, 00:13:30.434 "compare_and_write": false, 00:13:30.434 "abort": true, 00:13:30.434 "nvme_admin": false, 00:13:30.434 "nvme_io": false 00:13:30.434 }, 00:13:30.434 "memory_domains": [ 00:13:30.434 { 00:13:30.434 "dma_device_id": "system", 00:13:30.434 "dma_device_type": 1 00:13:30.434 }, 00:13:30.434 { 00:13:30.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.434 "dma_device_type": 2 
00:13:30.434 } 00:13:30.434 ], 00:13:30.434 "driver_specific": { 00:13:30.434 "passthru": { 00:13:30.434 "name": "pt2", 00:13:30.434 "base_bdev_name": "malloc2" 00:13:30.434 } 00:13:30.434 } 00:13:30.434 }' 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.434 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.695 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.695 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.695 13:41:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.695 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.955 "name": "pt3", 00:13:30.955 "aliases": [ 00:13:30.955 "00000000-0000-0000-0000-000000000003" 
00:13:30.955 ], 00:13:30.955 "product_name": "passthru", 00:13:30.955 "block_size": 512, 00:13:30.955 "num_blocks": 65536, 00:13:30.955 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:30.955 "assigned_rate_limits": { 00:13:30.955 "rw_ios_per_sec": 0, 00:13:30.955 "rw_mbytes_per_sec": 0, 00:13:30.955 "r_mbytes_per_sec": 0, 00:13:30.955 "w_mbytes_per_sec": 0 00:13:30.955 }, 00:13:30.955 "claimed": true, 00:13:30.955 "claim_type": "exclusive_write", 00:13:30.955 "zoned": false, 00:13:30.955 "supported_io_types": { 00:13:30.955 "read": true, 00:13:30.955 "write": true, 00:13:30.955 "unmap": true, 00:13:30.955 "write_zeroes": true, 00:13:30.955 "flush": true, 00:13:30.955 "reset": true, 00:13:30.955 "compare": false, 00:13:30.955 "compare_and_write": false, 00:13:30.955 "abort": true, 00:13:30.955 "nvme_admin": false, 00:13:30.955 "nvme_io": false 00:13:30.955 }, 00:13:30.955 "memory_domains": [ 00:13:30.955 { 00:13:30.955 "dma_device_id": "system", 00:13:30.955 "dma_device_type": 1 00:13:30.955 }, 00:13:30.955 { 00:13:30.955 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.955 "dma_device_type": 2 00:13:30.955 } 00:13:30.955 ], 00:13:30.955 "driver_specific": { 00:13:30.955 "passthru": { 00:13:30.955 "name": "pt3", 00:13:30.955 "base_bdev_name": "malloc3" 00:13:30.955 } 00:13:30.955 } 00:13:30.955 }' 00:13:30.955 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.222 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:31.530 [2024-06-10 13:41:45.958313] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c '!=' e13fc2cd-1a7a-48c6-8ba9-5008e2c7180c ']' 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1539911 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1539911 ']' 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1539911 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:31.530 13:41:45 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1539911 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1539911' 00:13:31.825 killing process with pid 1539911 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1539911 00:13:31.825 [2024-06-10 13:41:46.023565] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:31.825 [2024-06-10 13:41:46.023608] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:31.825 [2024-06-10 13:41:46.023653] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:31.825 [2024-06-10 13:41:46.023660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x233f720 name raid_bdev1, state offline 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1539911 00:13:31.825 [2024-06-10 13:41:46.039302] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:31.825 00:13:31.825 real 0m12.547s 00:13:31.825 user 0m23.075s 00:13:31.825 sys 0m1.832s 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:31.825 13:41:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.825 ************************************ 00:13:31.825 END TEST raid_superblock_test 00:13:31.825 ************************************ 00:13:31.825 13:41:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:13:31.825 13:41:46 bdev_raid 
-- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:31.825 13:41:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:31.825 13:41:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:31.825 ************************************ 00:13:31.825 START TEST raid_read_error_test 00:13:31.825 ************************************ 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 read 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HHzHnEcXfY 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1542650 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1542650 /var/tmp/spdk-raid.sock 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1542650 ']' 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:31.825 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:31.825 13:41:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.085 [2024-06-10 13:41:46.310522] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:13:32.085 [2024-06-10 13:41:46.310578] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542650 ] 00:13:32.085 [2024-06-10 13:41:46.405136] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:32.085 [2024-06-10 13:41:46.483157] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:13:32.085 [2024-06-10 13:41:46.532505] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:32.085 [2024-06-10 13:41:46.532535] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.025 13:41:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:33.025 13:41:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:33.025 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:33.025 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:33.025 BaseBdev1_malloc 00:13:33.025 13:41:47 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:33.285 true 00:13:33.285 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:33.285 [2024-06-10 13:41:47.756154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:33.285 [2024-06-10 13:41:47.756192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.285 [2024-06-10 13:41:47.756204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8cc90 00:13:33.285 [2024-06-10 13:41:47.756211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.285 [2024-06-10 13:41:47.757635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.285 [2024-06-10 13:41:47.757655] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:33.285 BaseBdev1 00:13:33.545 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:33.545 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:33.545 BaseBdev2_malloc 00:13:33.545 13:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:33.807 true 00:13:33.807 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc 
-p BaseBdev2 00:13:34.068 [2024-06-10 13:41:48.335629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:34.068 [2024-06-10 13:41:48.335657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:34.068 [2024-06-10 13:41:48.335669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c91400 00:13:34.068 [2024-06-10 13:41:48.335676] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.068 [2024-06-10 13:41:48.336917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.068 [2024-06-10 13:41:48.336937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:34.068 BaseBdev2 00:13:34.068 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:34.068 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:34.068 BaseBdev3_malloc 00:13:34.328 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:34.328 true 00:13:34.328 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:34.588 [2024-06-10 13:41:48.914916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:34.589 [2024-06-10 13:41:48.914943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:34.589 [2024-06-10 13:41:48.914956] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c93fc0 00:13:34.589 [2024-06-10 
13:41:48.914962] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.589 [2024-06-10 13:41:48.916188] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.589 [2024-06-10 13:41:48.916207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:34.589 BaseBdev3 00:13:34.589 13:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:34.849 [2024-06-10 13:41:49.107426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:34.849 [2024-06-10 13:41:49.108473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:34.849 [2024-06-10 13:41:49.108528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:34.849 [2024-06-10 13:41:49.108695] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c92060 00:13:34.849 [2024-06-10 13:41:49.108703] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:34.849 [2024-06-10 13:41:49.108853] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae2ee0 00:13:34.849 [2024-06-10 13:41:49.108972] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c92060 00:13:34.849 [2024-06-10 13:41:49.108977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c92060 00:13:34.849 [2024-06-10 13:41:49.109054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.849 "name": "raid_bdev1", 00:13:34.849 "uuid": "5c93bd04-65be-43ea-8d3f-f762cc6167fb", 00:13:34.849 "strip_size_kb": 64, 00:13:34.849 "state": "online", 00:13:34.849 "raid_level": "concat", 00:13:34.849 "superblock": true, 00:13:34.849 "num_base_bdevs": 3, 00:13:34.849 "num_base_bdevs_discovered": 3, 00:13:34.849 "num_base_bdevs_operational": 3, 00:13:34.849 "base_bdevs_list": [ 00:13:34.849 { 00:13:34.849 "name": "BaseBdev1", 00:13:34.849 "uuid": "e343cee3-c42b-534e-ad87-a03f240df21a", 00:13:34.849 "is_configured": true, 00:13:34.849 "data_offset": 2048, 00:13:34.849 "data_size": 63488 00:13:34.849 }, 00:13:34.849 { 00:13:34.849 "name": "BaseBdev2", 00:13:34.849 "uuid": 
"9ea64bc1-9bcb-5d82-90ca-b71b07baf06c", 00:13:34.849 "is_configured": true, 00:13:34.849 "data_offset": 2048, 00:13:34.849 "data_size": 63488 00:13:34.849 }, 00:13:34.849 { 00:13:34.849 "name": "BaseBdev3", 00:13:34.849 "uuid": "ea77e5cf-1545-59b2-8218-6727e7e5afea", 00:13:34.849 "is_configured": true, 00:13:34.849 "data_offset": 2048, 00:13:34.849 "data_size": 63488 00:13:34.849 } 00:13:34.849 ] 00:13:34.849 }' 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.849 13:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.422 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:35.422 13:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:35.682 [2024-06-10 13:41:49.969806] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ecde0 00:13:36.624 13:41:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:36.624 13:41:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.624 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:36.885 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.885 "name": "raid_bdev1", 00:13:36.885 "uuid": "5c93bd04-65be-43ea-8d3f-f762cc6167fb", 00:13:36.885 "strip_size_kb": 64, 00:13:36.885 "state": "online", 00:13:36.885 "raid_level": "concat", 00:13:36.885 "superblock": true, 00:13:36.885 "num_base_bdevs": 3, 00:13:36.885 "num_base_bdevs_discovered": 3, 00:13:36.885 "num_base_bdevs_operational": 3, 00:13:36.885 "base_bdevs_list": [ 00:13:36.885 { 00:13:36.885 "name": "BaseBdev1", 00:13:36.885 "uuid": "e343cee3-c42b-534e-ad87-a03f240df21a", 00:13:36.885 "is_configured": true, 00:13:36.885 "data_offset": 2048, 00:13:36.885 "data_size": 63488 00:13:36.885 }, 00:13:36.885 { 00:13:36.885 "name": "BaseBdev2", 00:13:36.885 "uuid": "9ea64bc1-9bcb-5d82-90ca-b71b07baf06c", 00:13:36.885 "is_configured": true, 00:13:36.885 "data_offset": 2048, 00:13:36.885 "data_size": 63488 00:13:36.885 }, 
00:13:36.885 { 00:13:36.885 "name": "BaseBdev3", 00:13:36.885 "uuid": "ea77e5cf-1545-59b2-8218-6727e7e5afea", 00:13:36.885 "is_configured": true, 00:13:36.885 "data_offset": 2048, 00:13:36.885 "data_size": 63488 00:13:36.885 } 00:13:36.885 ] 00:13:36.885 }' 00:13:36.885 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.885 13:41:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.455 13:41:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:37.715 [2024-06-10 13:41:52.034057] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:37.715 [2024-06-10 13:41:52.034092] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:37.715 [2024-06-10 13:41:52.036903] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:37.715 [2024-06-10 13:41:52.036934] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.715 [2024-06-10 13:41:52.036959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:37.716 [2024-06-10 13:41:52.036966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c92060 name raid_bdev1, state offline 00:13:37.716 0 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1542650 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1542650 ']' 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1542650 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:37.716 
13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1542650 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1542650' 00:13:37.716 killing process with pid 1542650 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1542650 00:13:37.716 [2024-06-10 13:41:52.105777] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:37.716 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1542650 00:13:37.716 [2024-06-10 13:41:52.116985] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HHzHnEcXfY 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:13:37.977 00:13:37.977 real 0m6.017s 00:13:37.977 user 0m9.645s 00:13:37.977 sys 0m0.827s 00:13:37.977 13:41:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:37.977 13:41:52 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.977 ************************************ 00:13:37.977 END TEST raid_read_error_test 00:13:37.977 ************************************ 00:13:37.977 13:41:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:13:37.977 13:41:52 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:37.977 13:41:52 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:37.977 13:41:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.977 ************************************ 00:13:37.977 START TEST raid_write_error_test 00:13:37.977 ************************************ 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 write 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= 
num_base_bdevs )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.82fKdHDuWh 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1543934 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1543934 /var/tmp/spdk-raid.sock 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1543934 ']' 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.977 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:37.977 13:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.977 [2024-06-10 13:41:52.410787] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:13:37.977 [2024-06-10 13:41:52.410836] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1543934 ] 00:13:38.237 [2024-06-10 13:41:52.497818] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:38.237 [2024-06-10 13:41:52.565590] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:13:38.237 [2024-06-10 13:41:52.607448] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.237 [2024-06-10 13:41:52.607471] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.808 13:41:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:38.808 13:41:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:38.808 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.808 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:39.068 BaseBdev1_malloc 00:13:39.068 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:39.328 true 00:13:39.328 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:39.588 [2024-06-10 13:41:53.851120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:39.588 [2024-06-10 13:41:53.851152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:13:39.588 [2024-06-10 13:41:53.851167] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1602c90 00:13:39.588 [2024-06-10 13:41:53.851174] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.588 [2024-06-10 13:41:53.852632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.588 [2024-06-10 13:41:53.852654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:39.588 BaseBdev1 00:13:39.588 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:39.588 13:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:39.588 BaseBdev2_malloc 00:13:39.588 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:39.848 true 00:13:39.848 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:40.108 [2024-06-10 13:41:54.418588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:40.109 [2024-06-10 13:41:54.418615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:40.109 [2024-06-10 13:41:54.418627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1607400 00:13:40.109 [2024-06-10 13:41:54.418635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.109 [2024-06-10 13:41:54.419885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.109 [2024-06-10 13:41:54.419905] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:40.109 BaseBdev2 00:13:40.109 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:40.109 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:40.370 BaseBdev3_malloc 00:13:40.370 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:40.370 true 00:13:40.370 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:40.630 [2024-06-10 13:41:54.986045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:40.630 [2024-06-10 13:41:54.986071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:40.630 [2024-06-10 13:41:54.986085] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1609fc0 00:13:40.630 [2024-06-10 13:41:54.986091] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.630 [2024-06-10 13:41:54.987354] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.630 [2024-06-10 13:41:54.987378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:40.630 BaseBdev3 00:13:40.630 13:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:40.890 [2024-06-10 13:41:55.186576] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:40.890 [2024-06-10 13:41:55.187656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.890 [2024-06-10 13:41:55.187713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:40.890 [2024-06-10 13:41:55.187885] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1608060 00:13:40.890 [2024-06-10 13:41:55.187893] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:40.890 [2024-06-10 13:41:55.188046] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1458ee0 00:13:40.890 [2024-06-10 13:41:55.188174] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1608060 00:13:40.890 [2024-06-10 13:41:55.188181] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1608060 00:13:40.890 [2024-06-10 13:41:55.188260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.890 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:41.151 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.151 "name": "raid_bdev1", 00:13:41.151 "uuid": "1014e0e3-0a9f-4f7b-af20-83d3be1ecf33", 00:13:41.151 "strip_size_kb": 64, 00:13:41.151 "state": "online", 00:13:41.151 "raid_level": "concat", 00:13:41.151 "superblock": true, 00:13:41.151 "num_base_bdevs": 3, 00:13:41.151 "num_base_bdevs_discovered": 3, 00:13:41.151 "num_base_bdevs_operational": 3, 00:13:41.151 "base_bdevs_list": [ 00:13:41.151 { 00:13:41.151 "name": "BaseBdev1", 00:13:41.151 "uuid": "a7807ea8-8da5-5854-bbdd-eb34949d69fa", 00:13:41.151 "is_configured": true, 00:13:41.151 "data_offset": 2048, 00:13:41.151 "data_size": 63488 00:13:41.151 }, 00:13:41.151 { 00:13:41.151 "name": "BaseBdev2", 00:13:41.151 "uuid": "d8ece60b-7acd-5b49-96b9-266064dee197", 00:13:41.151 "is_configured": true, 00:13:41.151 "data_offset": 2048, 00:13:41.151 "data_size": 63488 00:13:41.151 }, 00:13:41.151 { 00:13:41.151 "name": "BaseBdev3", 00:13:41.151 "uuid": "15011f31-c94c-56f3-bb4c-1b1d6c16f834", 00:13:41.151 "is_configured": true, 00:13:41.151 "data_offset": 2048, 00:13:41.151 "data_size": 63488 00:13:41.151 } 00:13:41.151 ] 00:13:41.151 }' 00:13:41.151 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.151 13:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.721 13:41:55 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:13:41.721 13:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:41.721 [2024-06-10 13:41:56.032915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1162de0 00:13:42.662 13:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.923 13:41:57 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.923 "name": "raid_bdev1", 00:13:42.923 "uuid": "1014e0e3-0a9f-4f7b-af20-83d3be1ecf33", 00:13:42.923 "strip_size_kb": 64, 00:13:42.923 "state": "online", 00:13:42.923 "raid_level": "concat", 00:13:42.923 "superblock": true, 00:13:42.923 "num_base_bdevs": 3, 00:13:42.923 "num_base_bdevs_discovered": 3, 00:13:42.923 "num_base_bdevs_operational": 3, 00:13:42.923 "base_bdevs_list": [ 00:13:42.923 { 00:13:42.923 "name": "BaseBdev1", 00:13:42.923 "uuid": "a7807ea8-8da5-5854-bbdd-eb34949d69fa", 00:13:42.923 "is_configured": true, 00:13:42.923 "data_offset": 2048, 00:13:42.923 "data_size": 63488 00:13:42.923 }, 00:13:42.923 { 00:13:42.923 "name": "BaseBdev2", 00:13:42.923 "uuid": "d8ece60b-7acd-5b49-96b9-266064dee197", 00:13:42.923 "is_configured": true, 00:13:42.923 "data_offset": 2048, 00:13:42.923 "data_size": 63488 00:13:42.923 }, 00:13:42.923 { 00:13:42.923 "name": "BaseBdev3", 00:13:42.923 "uuid": "15011f31-c94c-56f3-bb4c-1b1d6c16f834", 00:13:42.923 "is_configured": true, 00:13:42.923 "data_offset": 2048, 00:13:42.923 "data_size": 63488 00:13:42.923 } 00:13:42.923 ] 00:13:42.923 }' 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.923 13:41:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.494 13:41:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:13:43.755 [2024-06-10 13:41:58.043249] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.755 [2024-06-10 13:41:58.043278] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.755 [2024-06-10 13:41:58.046076] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.755 [2024-06-10 13:41:58.046104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:43.755 [2024-06-10 13:41:58.046128] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.755 [2024-06-10 13:41:58.046134] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1608060 name raid_bdev1, state offline 00:13:43.755 0 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1543934 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1543934 ']' 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1543934 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1543934 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1543934' 00:13:43.755 killing process with pid 1543934 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1543934 00:13:43.755 
[2024-06-10 13:41:58.112773] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:43.755 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1543934 00:13:43.755 [2024-06-10 13:41:58.123746] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.82fKdHDuWh 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:13:44.016 00:13:44.016 real 0m5.926s 00:13:44.016 user 0m9.514s 00:13:44.016 sys 0m0.764s 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:44.016 13:41:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.016 ************************************ 00:13:44.016 END TEST raid_write_error_test 00:13:44.016 ************************************ 00:13:44.016 13:41:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:44.016 13:41:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:13:44.016 13:41:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:44.016 13:41:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:44.016 13:41:58 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:44.016 ************************************ 00:13:44.016 START TEST raid_state_function_test 00:13:44.016 ************************************ 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 false 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1545301 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1545301' 00:13:44.016 Process raid pid: 1545301 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1545301 /var/tmp/spdk-raid.sock 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1545301 ']' 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:44.016 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:44.016 13:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.016 [2024-06-10 13:41:58.404400] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:13:44.016 [2024-06-10 13:41:58.404453] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:44.277 [2024-06-10 13:41:58.494225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:44.277 [2024-06-10 13:41:58.563547] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:13:44.277 [2024-06-10 13:41:58.603612] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.277 [2024-06-10 13:41:58.603633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.847 13:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:44.847 13:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:13:44.847 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.107 [2024-06-10 13:41:59.451377] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.107 
[2024-06-10 13:41:59.451406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.107 [2024-06-10 13:41:59.451412] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:45.107 [2024-06-10 13:41:59.451419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.107 [2024-06-10 13:41:59.451428] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.107 [2024-06-10 13:41:59.451434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:45.107 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.368 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.368 "name": "Existed_Raid", 00:13:45.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.368 "strip_size_kb": 0, 00:13:45.368 "state": "configuring", 00:13:45.368 "raid_level": "raid1", 00:13:45.368 "superblock": false, 00:13:45.368 "num_base_bdevs": 3, 00:13:45.368 "num_base_bdevs_discovered": 0, 00:13:45.368 "num_base_bdevs_operational": 3, 00:13:45.368 "base_bdevs_list": [ 00:13:45.368 { 00:13:45.368 "name": "BaseBdev1", 00:13:45.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.368 "is_configured": false, 00:13:45.368 "data_offset": 0, 00:13:45.368 "data_size": 0 00:13:45.368 }, 00:13:45.368 { 00:13:45.368 "name": "BaseBdev2", 00:13:45.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.368 "is_configured": false, 00:13:45.368 "data_offset": 0, 00:13:45.368 "data_size": 0 00:13:45.368 }, 00:13:45.368 { 00:13:45.368 "name": "BaseBdev3", 00:13:45.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.368 "is_configured": false, 00:13:45.368 "data_offset": 0, 00:13:45.368 "data_size": 0 00:13:45.368 } 00:13:45.368 ] 00:13:45.368 }' 00:13:45.368 13:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.368 13:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.939 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:45.939 [2024-06-10 13:42:00.393659] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:45.939 [2024-06-10 13:42:00.393683] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe65740 name Existed_Raid, state 
configuring 00:13:45.939 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:46.200 [2024-06-10 13:42:00.590158] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.200 [2024-06-10 13:42:00.590181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.200 [2024-06-10 13:42:00.590187] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.200 [2024-06-10 13:42:00.590193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.200 [2024-06-10 13:42:00.590198] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:46.200 [2024-06-10 13:42:00.590204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:46.200 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:46.460 [2024-06-10 13:42:00.793536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.460 BaseBdev1 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:46.460 13:42:00 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:46.460 13:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.721 13:42:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:46.981 [ 00:13:46.981 { 00:13:46.981 "name": "BaseBdev1", 00:13:46.981 "aliases": [ 00:13:46.981 "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6" 00:13:46.981 ], 00:13:46.981 "product_name": "Malloc disk", 00:13:46.981 "block_size": 512, 00:13:46.981 "num_blocks": 65536, 00:13:46.981 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:46.981 "assigned_rate_limits": { 00:13:46.981 "rw_ios_per_sec": 0, 00:13:46.981 "rw_mbytes_per_sec": 0, 00:13:46.981 "r_mbytes_per_sec": 0, 00:13:46.981 "w_mbytes_per_sec": 0 00:13:46.981 }, 00:13:46.981 "claimed": true, 00:13:46.981 "claim_type": "exclusive_write", 00:13:46.981 "zoned": false, 00:13:46.981 "supported_io_types": { 00:13:46.981 "read": true, 00:13:46.981 "write": true, 00:13:46.981 "unmap": true, 00:13:46.981 "write_zeroes": true, 00:13:46.981 "flush": true, 00:13:46.981 "reset": true, 00:13:46.981 "compare": false, 00:13:46.981 "compare_and_write": false, 00:13:46.981 "abort": true, 00:13:46.981 "nvme_admin": false, 00:13:46.981 "nvme_io": false 00:13:46.981 }, 00:13:46.981 "memory_domains": [ 00:13:46.981 { 00:13:46.981 "dma_device_id": "system", 00:13:46.981 "dma_device_type": 1 00:13:46.981 }, 00:13:46.981 { 00:13:46.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.981 "dma_device_type": 2 00:13:46.981 } 00:13:46.981 ], 00:13:46.981 "driver_specific": {} 00:13:46.981 } 00:13:46.981 ] 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:46.981 
13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.981 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.982 "name": "Existed_Raid", 00:13:46.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.982 "strip_size_kb": 0, 00:13:46.982 "state": "configuring", 00:13:46.982 "raid_level": "raid1", 00:13:46.982 "superblock": false, 00:13:46.982 "num_base_bdevs": 3, 00:13:46.982 "num_base_bdevs_discovered": 1, 00:13:46.982 "num_base_bdevs_operational": 3, 00:13:46.982 "base_bdevs_list": [ 00:13:46.982 { 
00:13:46.982 "name": "BaseBdev1", 00:13:46.982 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:46.982 "is_configured": true, 00:13:46.982 "data_offset": 0, 00:13:46.982 "data_size": 65536 00:13:46.982 }, 00:13:46.982 { 00:13:46.982 "name": "BaseBdev2", 00:13:46.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.982 "is_configured": false, 00:13:46.982 "data_offset": 0, 00:13:46.982 "data_size": 0 00:13:46.982 }, 00:13:46.982 { 00:13:46.982 "name": "BaseBdev3", 00:13:46.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.982 "is_configured": false, 00:13:46.982 "data_offset": 0, 00:13:46.982 "data_size": 0 00:13:46.982 } 00:13:46.982 ] 00:13:46.982 }' 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.982 13:42:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.552 13:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:47.811 [2024-06-10 13:42:02.092821] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:47.811 [2024-06-10 13:42:02.092849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe65010 name Existed_Raid, state configuring 00:13:47.811 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:48.070 [2024-06-10 13:42:02.297362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:48.070 [2024-06-10 13:42:02.298570] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:48.070 [2024-06-10 13:42:02.298595] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist 
now 00:13:48.070 [2024-06-10 13:42:02.298601] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:48.070 [2024-06-10 13:42:02.298607] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.070 13:42:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.070 "name": "Existed_Raid", 00:13:48.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.070 "strip_size_kb": 0, 00:13:48.070 "state": "configuring", 00:13:48.070 "raid_level": "raid1", 00:13:48.070 "superblock": false, 00:13:48.070 "num_base_bdevs": 3, 00:13:48.070 "num_base_bdevs_discovered": 1, 00:13:48.070 "num_base_bdevs_operational": 3, 00:13:48.070 "base_bdevs_list": [ 00:13:48.070 { 00:13:48.070 "name": "BaseBdev1", 00:13:48.070 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:48.070 "is_configured": true, 00:13:48.070 "data_offset": 0, 00:13:48.070 "data_size": 65536 00:13:48.070 }, 00:13:48.070 { 00:13:48.070 "name": "BaseBdev2", 00:13:48.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.070 "is_configured": false, 00:13:48.070 "data_offset": 0, 00:13:48.070 "data_size": 0 00:13:48.070 }, 00:13:48.070 { 00:13:48.070 "name": "BaseBdev3", 00:13:48.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.070 "is_configured": false, 00:13:48.070 "data_offset": 0, 00:13:48.070 "data_size": 0 00:13:48.070 } 00:13:48.070 ] 00:13:48.070 }' 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.070 13:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.641 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:48.901 [2024-06-10 13:42:03.216771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:48.901 BaseBdev2 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:48.901 13:42:03 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:48.901 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:49.161 [ 00:13:49.161 { 00:13:49.161 "name": "BaseBdev2", 00:13:49.161 "aliases": [ 00:13:49.161 "c3e405b3-0370-42a6-b40e-d1e9af5db7b1" 00:13:49.161 ], 00:13:49.161 "product_name": "Malloc disk", 00:13:49.161 "block_size": 512, 00:13:49.161 "num_blocks": 65536, 00:13:49.161 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:49.161 "assigned_rate_limits": { 00:13:49.161 "rw_ios_per_sec": 0, 00:13:49.161 "rw_mbytes_per_sec": 0, 00:13:49.161 "r_mbytes_per_sec": 0, 00:13:49.161 "w_mbytes_per_sec": 0 00:13:49.161 }, 00:13:49.161 "claimed": true, 00:13:49.161 "claim_type": "exclusive_write", 00:13:49.161 "zoned": false, 00:13:49.161 "supported_io_types": { 00:13:49.161 "read": true, 00:13:49.161 "write": true, 00:13:49.161 "unmap": true, 00:13:49.161 "write_zeroes": true, 00:13:49.161 "flush": true, 00:13:49.161 "reset": true, 00:13:49.161 "compare": false, 00:13:49.161 "compare_and_write": false, 00:13:49.161 "abort": true, 00:13:49.161 "nvme_admin": false, 00:13:49.161 "nvme_io": false 00:13:49.161 }, 00:13:49.161 "memory_domains": [ 00:13:49.161 { 00:13:49.161 "dma_device_id": "system", 00:13:49.161 "dma_device_type": 1 
00:13:49.161 }, 00:13:49.161 { 00:13:49.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.161 "dma_device_type": 2 00:13:49.161 } 00:13:49.161 ], 00:13:49.161 "driver_specific": {} 00:13:49.161 } 00:13:49.161 ] 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.161 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.421 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.421 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:13:49.421 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.421 "name": "Existed_Raid", 00:13:49.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.421 "strip_size_kb": 0, 00:13:49.421 "state": "configuring", 00:13:49.421 "raid_level": "raid1", 00:13:49.421 "superblock": false, 00:13:49.421 "num_base_bdevs": 3, 00:13:49.421 "num_base_bdevs_discovered": 2, 00:13:49.421 "num_base_bdevs_operational": 3, 00:13:49.421 "base_bdevs_list": [ 00:13:49.421 { 00:13:49.421 "name": "BaseBdev1", 00:13:49.421 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:49.421 "is_configured": true, 00:13:49.421 "data_offset": 0, 00:13:49.421 "data_size": 65536 00:13:49.421 }, 00:13:49.421 { 00:13:49.421 "name": "BaseBdev2", 00:13:49.421 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:49.421 "is_configured": true, 00:13:49.421 "data_offset": 0, 00:13:49.421 "data_size": 65536 00:13:49.421 }, 00:13:49.421 { 00:13:49.421 "name": "BaseBdev3", 00:13:49.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.421 "is_configured": false, 00:13:49.421 "data_offset": 0, 00:13:49.421 "data_size": 0 00:13:49.421 } 00:13:49.421 ] 00:13:49.421 }' 00:13:49.421 13:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.421 13:42:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.991 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:50.252 [2024-06-10 13:42:04.561316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:50.252 [2024-06-10 13:42:04.561341] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe65f00 00:13:50.252 [2024-06-10 13:42:04.561346] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, 
blocklen 512 00:13:50.252 [2024-06-10 13:42:04.561507] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe7cdf0 00:13:50.252 [2024-06-10 13:42:04.561609] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe65f00 00:13:50.252 [2024-06-10 13:42:04.561616] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe65f00 00:13:50.252 [2024-06-10 13:42:04.561738] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.252 BaseBdev3 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:50.252 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:50.513 [ 00:13:50.513 { 00:13:50.513 "name": "BaseBdev3", 00:13:50.513 "aliases": [ 00:13:50.513 "d9c974c7-6d6c-41d2-a226-620b819f0045" 00:13:50.513 ], 00:13:50.513 "product_name": "Malloc disk", 00:13:50.513 "block_size": 512, 00:13:50.513 "num_blocks": 65536, 00:13:50.513 "uuid": "d9c974c7-6d6c-41d2-a226-620b819f0045", 00:13:50.513 
"assigned_rate_limits": { 00:13:50.513 "rw_ios_per_sec": 0, 00:13:50.513 "rw_mbytes_per_sec": 0, 00:13:50.513 "r_mbytes_per_sec": 0, 00:13:50.513 "w_mbytes_per_sec": 0 00:13:50.513 }, 00:13:50.513 "claimed": true, 00:13:50.513 "claim_type": "exclusive_write", 00:13:50.513 "zoned": false, 00:13:50.513 "supported_io_types": { 00:13:50.513 "read": true, 00:13:50.513 "write": true, 00:13:50.513 "unmap": true, 00:13:50.513 "write_zeroes": true, 00:13:50.513 "flush": true, 00:13:50.513 "reset": true, 00:13:50.513 "compare": false, 00:13:50.513 "compare_and_write": false, 00:13:50.513 "abort": true, 00:13:50.513 "nvme_admin": false, 00:13:50.513 "nvme_io": false 00:13:50.513 }, 00:13:50.513 "memory_domains": [ 00:13:50.513 { 00:13:50.513 "dma_device_id": "system", 00:13:50.513 "dma_device_type": 1 00:13:50.513 }, 00:13:50.513 { 00:13:50.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.513 "dma_device_type": 2 00:13:50.513 } 00:13:50.513 ], 00:13:50.513 "driver_specific": {} 00:13:50.513 } 00:13:50.513 ] 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.513 13:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.773 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.773 "name": "Existed_Raid", 00:13:50.773 "uuid": "5a0c19cd-fc5f-4731-8561-2fbc4a8737dc", 00:13:50.773 "strip_size_kb": 0, 00:13:50.773 "state": "online", 00:13:50.773 "raid_level": "raid1", 00:13:50.773 "superblock": false, 00:13:50.773 "num_base_bdevs": 3, 00:13:50.773 "num_base_bdevs_discovered": 3, 00:13:50.773 "num_base_bdevs_operational": 3, 00:13:50.773 "base_bdevs_list": [ 00:13:50.773 { 00:13:50.773 "name": "BaseBdev1", 00:13:50.773 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:50.773 "is_configured": true, 00:13:50.773 "data_offset": 0, 00:13:50.773 "data_size": 65536 00:13:50.773 }, 00:13:50.773 { 00:13:50.774 "name": "BaseBdev2", 00:13:50.774 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:50.774 "is_configured": true, 00:13:50.774 "data_offset": 0, 00:13:50.774 "data_size": 65536 00:13:50.774 }, 00:13:50.774 { 00:13:50.774 "name": "BaseBdev3", 00:13:50.774 "uuid": "d9c974c7-6d6c-41d2-a226-620b819f0045", 00:13:50.774 "is_configured": true, 00:13:50.774 "data_offset": 0, 00:13:50.774 "data_size": 65536 00:13:50.774 } 
00:13:50.774 ] 00:13:50.774 }' 00:13:50.774 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.774 13:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:51.344 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:51.605 [2024-06-10 13:42:05.924996] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:51.605 "name": "Existed_Raid", 00:13:51.605 "aliases": [ 00:13:51.605 "5a0c19cd-fc5f-4731-8561-2fbc4a8737dc" 00:13:51.605 ], 00:13:51.605 "product_name": "Raid Volume", 00:13:51.605 "block_size": 512, 00:13:51.605 "num_blocks": 65536, 00:13:51.605 "uuid": "5a0c19cd-fc5f-4731-8561-2fbc4a8737dc", 00:13:51.605 "assigned_rate_limits": { 00:13:51.605 "rw_ios_per_sec": 0, 00:13:51.605 "rw_mbytes_per_sec": 0, 00:13:51.605 "r_mbytes_per_sec": 0, 00:13:51.605 "w_mbytes_per_sec": 0 00:13:51.605 }, 00:13:51.605 "claimed": false, 00:13:51.605 "zoned": false, 
00:13:51.605 "supported_io_types": { 00:13:51.605 "read": true, 00:13:51.605 "write": true, 00:13:51.605 "unmap": false, 00:13:51.605 "write_zeroes": true, 00:13:51.605 "flush": false, 00:13:51.605 "reset": true, 00:13:51.605 "compare": false, 00:13:51.605 "compare_and_write": false, 00:13:51.605 "abort": false, 00:13:51.605 "nvme_admin": false, 00:13:51.605 "nvme_io": false 00:13:51.605 }, 00:13:51.605 "memory_domains": [ 00:13:51.605 { 00:13:51.605 "dma_device_id": "system", 00:13:51.605 "dma_device_type": 1 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.605 "dma_device_type": 2 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "dma_device_id": "system", 00:13:51.605 "dma_device_type": 1 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.605 "dma_device_type": 2 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "dma_device_id": "system", 00:13:51.605 "dma_device_type": 1 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.605 "dma_device_type": 2 00:13:51.605 } 00:13:51.605 ], 00:13:51.605 "driver_specific": { 00:13:51.605 "raid": { 00:13:51.605 "uuid": "5a0c19cd-fc5f-4731-8561-2fbc4a8737dc", 00:13:51.605 "strip_size_kb": 0, 00:13:51.605 "state": "online", 00:13:51.605 "raid_level": "raid1", 00:13:51.605 "superblock": false, 00:13:51.605 "num_base_bdevs": 3, 00:13:51.605 "num_base_bdevs_discovered": 3, 00:13:51.605 "num_base_bdevs_operational": 3, 00:13:51.605 "base_bdevs_list": [ 00:13:51.605 { 00:13:51.605 "name": "BaseBdev1", 00:13:51.605 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:51.605 "is_configured": true, 00:13:51.605 "data_offset": 0, 00:13:51.605 "data_size": 65536 00:13:51.605 }, 00:13:51.605 { 00:13:51.605 "name": "BaseBdev2", 00:13:51.605 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:51.605 "is_configured": true, 00:13:51.605 "data_offset": 0, 00:13:51.605 "data_size": 65536 00:13:51.605 }, 00:13:51.605 { 
00:13:51.605 "name": "BaseBdev3", 00:13:51.605 "uuid": "d9c974c7-6d6c-41d2-a226-620b819f0045", 00:13:51.605 "is_configured": true, 00:13:51.605 "data_offset": 0, 00:13:51.605 "data_size": 65536 00:13:51.605 } 00:13:51.605 ] 00:13:51.605 } 00:13:51.605 } 00:13:51.605 }' 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:51.605 BaseBdev2 00:13:51.605 BaseBdev3' 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:51.605 13:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.867 "name": "BaseBdev1", 00:13:51.867 "aliases": [ 00:13:51.867 "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6" 00:13:51.867 ], 00:13:51.867 "product_name": "Malloc disk", 00:13:51.867 "block_size": 512, 00:13:51.867 "num_blocks": 65536, 00:13:51.867 "uuid": "71a6f923-b9d9-42c3-a01b-c8e96ac6f3d6", 00:13:51.867 "assigned_rate_limits": { 00:13:51.867 "rw_ios_per_sec": 0, 00:13:51.867 "rw_mbytes_per_sec": 0, 00:13:51.867 "r_mbytes_per_sec": 0, 00:13:51.867 "w_mbytes_per_sec": 0 00:13:51.867 }, 00:13:51.867 "claimed": true, 00:13:51.867 "claim_type": "exclusive_write", 00:13:51.867 "zoned": false, 00:13:51.867 "supported_io_types": { 00:13:51.867 "read": true, 00:13:51.867 "write": true, 00:13:51.867 "unmap": true, 00:13:51.867 "write_zeroes": true, 00:13:51.867 "flush": true, 00:13:51.867 "reset": true, 00:13:51.867 "compare": false, 00:13:51.867 
"compare_and_write": false, 00:13:51.867 "abort": true, 00:13:51.867 "nvme_admin": false, 00:13:51.867 "nvme_io": false 00:13:51.867 }, 00:13:51.867 "memory_domains": [ 00:13:51.867 { 00:13:51.867 "dma_device_id": "system", 00:13:51.867 "dma_device_type": 1 00:13:51.867 }, 00:13:51.867 { 00:13:51.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.867 "dma_device_type": 2 00:13:51.867 } 00:13:51.867 ], 00:13:51.867 "driver_specific": {} 00:13:51.867 }' 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:51.867 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.128 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 00:13:52.129 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.389 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.389 "name": "BaseBdev2", 00:13:52.389 "aliases": [ 00:13:52.389 "c3e405b3-0370-42a6-b40e-d1e9af5db7b1" 00:13:52.389 ], 00:13:52.389 "product_name": "Malloc disk", 00:13:52.389 "block_size": 512, 00:13:52.389 "num_blocks": 65536, 00:13:52.389 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:52.389 "assigned_rate_limits": { 00:13:52.389 "rw_ios_per_sec": 0, 00:13:52.389 "rw_mbytes_per_sec": 0, 00:13:52.389 "r_mbytes_per_sec": 0, 00:13:52.389 "w_mbytes_per_sec": 0 00:13:52.389 }, 00:13:52.389 "claimed": true, 00:13:52.389 "claim_type": "exclusive_write", 00:13:52.389 "zoned": false, 00:13:52.389 "supported_io_types": { 00:13:52.389 "read": true, 00:13:52.389 "write": true, 00:13:52.389 "unmap": true, 00:13:52.389 "write_zeroes": true, 00:13:52.389 "flush": true, 00:13:52.389 "reset": true, 00:13:52.389 "compare": false, 00:13:52.389 "compare_and_write": false, 00:13:52.389 "abort": true, 00:13:52.389 "nvme_admin": false, 00:13:52.389 "nvme_io": false 00:13:52.389 }, 00:13:52.389 "memory_domains": [ 00:13:52.389 { 00:13:52.389 "dma_device_id": "system", 00:13:52.389 "dma_device_type": 1 00:13:52.389 }, 00:13:52.389 { 00:13:52.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.389 "dma_device_type": 2 00:13:52.389 } 00:13:52.389 ], 00:13:52.389 "driver_specific": {} 00:13:52.389 }' 00:13:52.389 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.389 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.389 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.389 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.650 13:42:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.650 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.650 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.650 13:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:52.650 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.911 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.911 "name": "BaseBdev3", 00:13:52.911 "aliases": [ 00:13:52.911 "d9c974c7-6d6c-41d2-a226-620b819f0045" 00:13:52.911 ], 00:13:52.911 "product_name": "Malloc disk", 00:13:52.911 "block_size": 512, 00:13:52.911 "num_blocks": 65536, 00:13:52.911 "uuid": "d9c974c7-6d6c-41d2-a226-620b819f0045", 00:13:52.911 "assigned_rate_limits": { 00:13:52.911 "rw_ios_per_sec": 0, 00:13:52.911 "rw_mbytes_per_sec": 0, 00:13:52.911 "r_mbytes_per_sec": 0, 00:13:52.911 "w_mbytes_per_sec": 0 00:13:52.911 }, 00:13:52.911 "claimed": true, 00:13:52.911 "claim_type": "exclusive_write", 00:13:52.911 "zoned": false, 00:13:52.911 "supported_io_types": { 00:13:52.911 "read": true, 
00:13:52.911 "write": true, 00:13:52.911 "unmap": true, 00:13:52.911 "write_zeroes": true, 00:13:52.911 "flush": true, 00:13:52.911 "reset": true, 00:13:52.911 "compare": false, 00:13:52.911 "compare_and_write": false, 00:13:52.911 "abort": true, 00:13:52.911 "nvme_admin": false, 00:13:52.911 "nvme_io": false 00:13:52.911 }, 00:13:52.911 "memory_domains": [ 00:13:52.911 { 00:13:52.911 "dma_device_id": "system", 00:13:52.911 "dma_device_type": 1 00:13:52.911 }, 00:13:52.911 { 00:13:52.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.911 "dma_device_type": 2 00:13:52.911 } 00:13:52.911 ], 00:13:52.911 "driver_specific": {} 00:13:52.911 }' 00:13:52.911 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.911 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.171 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:53.431 [2024-06-10 13:42:07.857716] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.431 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.432 13:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.691 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.691 "name": "Existed_Raid", 00:13:53.691 "uuid": "5a0c19cd-fc5f-4731-8561-2fbc4a8737dc", 00:13:53.691 "strip_size_kb": 0, 00:13:53.691 "state": "online", 00:13:53.691 "raid_level": "raid1", 00:13:53.691 "superblock": false, 00:13:53.691 "num_base_bdevs": 3, 00:13:53.691 "num_base_bdevs_discovered": 2, 00:13:53.691 "num_base_bdevs_operational": 2, 00:13:53.691 "base_bdevs_list": [ 00:13:53.691 { 00:13:53.691 "name": null, 00:13:53.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.691 "is_configured": false, 00:13:53.691 "data_offset": 0, 00:13:53.691 "data_size": 65536 00:13:53.691 }, 00:13:53.691 { 00:13:53.691 "name": "BaseBdev2", 00:13:53.691 "uuid": "c3e405b3-0370-42a6-b40e-d1e9af5db7b1", 00:13:53.691 "is_configured": true, 00:13:53.692 "data_offset": 0, 00:13:53.692 "data_size": 65536 00:13:53.692 }, 00:13:53.692 { 00:13:53.692 "name": "BaseBdev3", 00:13:53.692 "uuid": "d9c974c7-6d6c-41d2-a226-620b819f0045", 00:13:53.692 "is_configured": true, 00:13:53.692 "data_offset": 0, 00:13:53.692 "data_size": 65536 00:13:53.692 } 00:13:53.692 ] 00:13:53.692 }' 00:13:53.692 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.692 13:42:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.262 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:54.262 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.262 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.262 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.521 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.521 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.521 13:42:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:54.781 [2024-06-10 13:42:09.044728] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:54.781 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.781 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.781 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.781 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:55.040 [2024-06-10 13:42:09.451716] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:55.040 [2024-06-10 13:42:09.451778] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:55.040 
[2024-06-10 13:42:09.457924] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:55.040 [2024-06-10 13:42:09.457949] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:55.040 [2024-06-10 13:42:09.457955] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe65f00 name Existed_Raid, state offline 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.040 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.300 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:55.561 BaseBdev2 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:55.561 13:42:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:55.821 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:55.821 [ 00:13:55.821 { 00:13:55.821 "name": "BaseBdev2", 00:13:55.821 "aliases": [ 00:13:55.821 "e770fc58-575c-46f1-8857-47c93edeeae8" 00:13:55.821 ], 00:13:55.821 "product_name": "Malloc disk", 00:13:55.821 "block_size": 512, 00:13:55.821 "num_blocks": 65536, 00:13:55.821 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:13:55.821 "assigned_rate_limits": { 00:13:55.821 "rw_ios_per_sec": 0, 00:13:55.821 "rw_mbytes_per_sec": 0, 00:13:55.822 "r_mbytes_per_sec": 0, 00:13:55.822 "w_mbytes_per_sec": 0 00:13:55.822 }, 00:13:55.822 "claimed": false, 00:13:55.822 "zoned": false, 00:13:55.822 "supported_io_types": { 00:13:55.822 "read": true, 00:13:55.822 "write": true, 00:13:55.822 "unmap": true, 00:13:55.822 "write_zeroes": true, 00:13:55.822 "flush": true, 00:13:55.822 "reset": true, 00:13:55.822 "compare": false, 00:13:55.822 "compare_and_write": false, 00:13:55.822 "abort": true, 00:13:55.822 "nvme_admin": false, 00:13:55.822 "nvme_io": false 00:13:55.822 }, 00:13:55.822 "memory_domains": [ 00:13:55.822 { 00:13:55.822 "dma_device_id": "system", 00:13:55.822 "dma_device_type": 1 00:13:55.822 }, 00:13:55.822 { 00:13:55.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:55.822 "dma_device_type": 2 00:13:55.822 } 00:13:55.822 ], 00:13:55.822 "driver_specific": {} 00:13:55.822 } 00:13:55.822 ] 00:13:55.822 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:55.822 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:55.822 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.822 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:56.082 BaseBdev3 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:56.082 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.342 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:56.603 [ 00:13:56.603 { 00:13:56.603 "name": "BaseBdev3", 00:13:56.603 "aliases": [ 00:13:56.603 "4ab32965-42c2-4a8e-b115-464d77e5bba9" 00:13:56.603 ], 00:13:56.603 "product_name": "Malloc disk", 00:13:56.603 
"block_size": 512, 00:13:56.603 "num_blocks": 65536, 00:13:56.603 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:13:56.603 "assigned_rate_limits": { 00:13:56.603 "rw_ios_per_sec": 0, 00:13:56.603 "rw_mbytes_per_sec": 0, 00:13:56.603 "r_mbytes_per_sec": 0, 00:13:56.603 "w_mbytes_per_sec": 0 00:13:56.603 }, 00:13:56.603 "claimed": false, 00:13:56.603 "zoned": false, 00:13:56.603 "supported_io_types": { 00:13:56.603 "read": true, 00:13:56.603 "write": true, 00:13:56.603 "unmap": true, 00:13:56.603 "write_zeroes": true, 00:13:56.603 "flush": true, 00:13:56.603 "reset": true, 00:13:56.603 "compare": false, 00:13:56.603 "compare_and_write": false, 00:13:56.603 "abort": true, 00:13:56.603 "nvme_admin": false, 00:13:56.603 "nvme_io": false 00:13:56.603 }, 00:13:56.603 "memory_domains": [ 00:13:56.603 { 00:13:56.603 "dma_device_id": "system", 00:13:56.603 "dma_device_type": 1 00:13:56.603 }, 00:13:56.603 { 00:13:56.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.603 "dma_device_type": 2 00:13:56.603 } 00:13:56.603 ], 00:13:56.603 "driver_specific": {} 00:13:56.603 } 00:13:56.603 ] 00:13:56.603 13:42:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:56.603 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:56.603 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:56.603 13:42:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.603 [2024-06-10 13:42:11.051936] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:56.603 [2024-06-10 13:42:11.051966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:56.603 [2024-06-10 13:42:11.051979] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.603 [2024-06-10 13:42:11.053085] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.603 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.863 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.863 "name": "Existed_Raid", 00:13:56.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.863 "strip_size_kb": 0, 00:13:56.863 "state": "configuring", 00:13:56.863 "raid_level": 
"raid1", 00:13:56.863 "superblock": false, 00:13:56.863 "num_base_bdevs": 3, 00:13:56.863 "num_base_bdevs_discovered": 2, 00:13:56.863 "num_base_bdevs_operational": 3, 00:13:56.863 "base_bdevs_list": [ 00:13:56.863 { 00:13:56.863 "name": "BaseBdev1", 00:13:56.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.863 "is_configured": false, 00:13:56.863 "data_offset": 0, 00:13:56.863 "data_size": 0 00:13:56.863 }, 00:13:56.863 { 00:13:56.863 "name": "BaseBdev2", 00:13:56.863 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:13:56.863 "is_configured": true, 00:13:56.863 "data_offset": 0, 00:13:56.863 "data_size": 65536 00:13:56.863 }, 00:13:56.863 { 00:13:56.863 "name": "BaseBdev3", 00:13:56.863 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:13:56.863 "is_configured": true, 00:13:56.863 "data_offset": 0, 00:13:56.863 "data_size": 65536 00:13:56.863 } 00:13:56.863 ] 00:13:56.863 }' 00:13:56.863 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.863 13:42:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.432 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:57.692 [2024-06-10 13:42:11.954308] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.692 13:42:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.951 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.951 "name": "Existed_Raid", 00:13:57.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.951 "strip_size_kb": 0, 00:13:57.951 "state": "configuring", 00:13:57.951 "raid_level": "raid1", 00:13:57.951 "superblock": false, 00:13:57.951 "num_base_bdevs": 3, 00:13:57.951 "num_base_bdevs_discovered": 1, 00:13:57.951 "num_base_bdevs_operational": 3, 00:13:57.951 "base_bdevs_list": [ 00:13:57.951 { 00:13:57.951 "name": "BaseBdev1", 00:13:57.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.951 "is_configured": false, 00:13:57.951 "data_offset": 0, 00:13:57.951 "data_size": 0 00:13:57.951 }, 00:13:57.951 { 00:13:57.951 "name": null, 00:13:57.951 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:13:57.951 "is_configured": false, 00:13:57.951 "data_offset": 0, 00:13:57.951 "data_size": 65536 00:13:57.951 }, 00:13:57.951 { 00:13:57.951 "name": "BaseBdev3", 00:13:57.951 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:13:57.951 
"is_configured": true, 00:13:57.951 "data_offset": 0, 00:13:57.951 "data_size": 65536 00:13:57.951 } 00:13:57.951 ] 00:13:57.951 }' 00:13:57.951 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.951 13:42:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.519 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.519 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:58.519 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:58.519 13:42:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:58.779 [2024-06-10 13:42:13.114348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:58.779 BaseBdev1 00:13:58.779 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:58.779 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:58.779 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:58.780 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:58.780 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:58.780 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:58.780 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:59.063 [ 00:13:59.063 { 00:13:59.063 "name": "BaseBdev1", 00:13:59.063 "aliases": [ 00:13:59.063 "3d9509c0-d826-4749-aea8-d08fd450973a" 00:13:59.063 ], 00:13:59.063 "product_name": "Malloc disk", 00:13:59.063 "block_size": 512, 00:13:59.063 "num_blocks": 65536, 00:13:59.063 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:13:59.063 "assigned_rate_limits": { 00:13:59.063 "rw_ios_per_sec": 0, 00:13:59.063 "rw_mbytes_per_sec": 0, 00:13:59.063 "r_mbytes_per_sec": 0, 00:13:59.063 "w_mbytes_per_sec": 0 00:13:59.063 }, 00:13:59.063 "claimed": true, 00:13:59.063 "claim_type": "exclusive_write", 00:13:59.063 "zoned": false, 00:13:59.063 "supported_io_types": { 00:13:59.063 "read": true, 00:13:59.063 "write": true, 00:13:59.063 "unmap": true, 00:13:59.063 "write_zeroes": true, 00:13:59.063 "flush": true, 00:13:59.063 "reset": true, 00:13:59.063 "compare": false, 00:13:59.063 "compare_and_write": false, 00:13:59.063 "abort": true, 00:13:59.063 "nvme_admin": false, 00:13:59.063 "nvme_io": false 00:13:59.063 }, 00:13:59.063 "memory_domains": [ 00:13:59.063 { 00:13:59.063 "dma_device_id": "system", 00:13:59.063 "dma_device_type": 1 00:13:59.063 }, 00:13:59.063 { 00:13:59.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.063 "dma_device_type": 2 00:13:59.063 } 00:13:59.063 ], 00:13:59.063 "driver_specific": {} 00:13:59.063 } 00:13:59.063 ] 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.063 13:42:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.063 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.345 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.345 "name": "Existed_Raid", 00:13:59.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.345 "strip_size_kb": 0, 00:13:59.345 "state": "configuring", 00:13:59.345 "raid_level": "raid1", 00:13:59.345 "superblock": false, 00:13:59.345 "num_base_bdevs": 3, 00:13:59.345 "num_base_bdevs_discovered": 2, 00:13:59.345 "num_base_bdevs_operational": 3, 00:13:59.345 "base_bdevs_list": [ 00:13:59.345 { 00:13:59.345 "name": "BaseBdev1", 00:13:59.345 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:13:59.345 "is_configured": true, 00:13:59.345 "data_offset": 0, 00:13:59.345 "data_size": 65536 00:13:59.345 }, 00:13:59.345 { 00:13:59.345 "name": null, 00:13:59.345 "uuid": 
"e770fc58-575c-46f1-8857-47c93edeeae8", 00:13:59.345 "is_configured": false, 00:13:59.345 "data_offset": 0, 00:13:59.345 "data_size": 65536 00:13:59.345 }, 00:13:59.345 { 00:13:59.345 "name": "BaseBdev3", 00:13:59.345 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:13:59.345 "is_configured": true, 00:13:59.345 "data_offset": 0, 00:13:59.345 "data_size": 65536 00:13:59.345 } 00:13:59.345 ] 00:13:59.345 }' 00:13:59.345 13:42:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.345 13:42:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.918 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.918 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:00.179 [2024-06-10 13:42:14.582084] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:00.179 13:42:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.179 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.439 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.439 "name": "Existed_Raid", 00:14:00.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.439 "strip_size_kb": 0, 00:14:00.439 "state": "configuring", 00:14:00.439 "raid_level": "raid1", 00:14:00.439 "superblock": false, 00:14:00.439 "num_base_bdevs": 3, 00:14:00.439 "num_base_bdevs_discovered": 1, 00:14:00.439 "num_base_bdevs_operational": 3, 00:14:00.439 "base_bdevs_list": [ 00:14:00.439 { 00:14:00.439 "name": "BaseBdev1", 00:14:00.439 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:00.439 "is_configured": true, 00:14:00.439 "data_offset": 0, 00:14:00.439 "data_size": 65536 00:14:00.439 }, 00:14:00.439 { 00:14:00.439 "name": null, 00:14:00.439 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:00.439 "is_configured": false, 00:14:00.439 "data_offset": 0, 00:14:00.439 "data_size": 65536 00:14:00.439 }, 00:14:00.439 { 00:14:00.439 "name": null, 00:14:00.439 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:00.439 "is_configured": false, 00:14:00.439 "data_offset": 0, 00:14:00.439 
"data_size": 65536 00:14:00.439 } 00:14:00.439 ] 00:14:00.439 }' 00:14:00.439 13:42:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.439 13:42:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.008 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.008 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:01.268 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:01.268 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:01.528 [2024-06-10 13:42:15.753075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.528 "name": "Existed_Raid", 00:14:01.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.528 "strip_size_kb": 0, 00:14:01.528 "state": "configuring", 00:14:01.528 "raid_level": "raid1", 00:14:01.528 "superblock": false, 00:14:01.528 "num_base_bdevs": 3, 00:14:01.528 "num_base_bdevs_discovered": 2, 00:14:01.528 "num_base_bdevs_operational": 3, 00:14:01.528 "base_bdevs_list": [ 00:14:01.528 { 00:14:01.528 "name": "BaseBdev1", 00:14:01.528 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:01.528 "is_configured": true, 00:14:01.528 "data_offset": 0, 00:14:01.528 "data_size": 65536 00:14:01.528 }, 00:14:01.528 { 00:14:01.528 "name": null, 00:14:01.528 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:01.528 "is_configured": false, 00:14:01.528 "data_offset": 0, 00:14:01.528 "data_size": 65536 00:14:01.528 }, 00:14:01.528 { 00:14:01.528 "name": "BaseBdev3", 00:14:01.528 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:01.528 "is_configured": true, 00:14:01.528 "data_offset": 0, 00:14:01.528 "data_size": 65536 00:14:01.528 } 00:14:01.528 ] 00:14:01.528 }' 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.528 13:42:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.097 13:42:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.097 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:02.357 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:02.357 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.617 [2024-06-10 13:42:16.924068] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.617 13:42:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.877 13:42:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.877 "name": "Existed_Raid", 00:14:02.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.877 "strip_size_kb": 0, 00:14:02.877 "state": "configuring", 00:14:02.877 "raid_level": "raid1", 00:14:02.877 "superblock": false, 00:14:02.877 "num_base_bdevs": 3, 00:14:02.877 "num_base_bdevs_discovered": 1, 00:14:02.877 "num_base_bdevs_operational": 3, 00:14:02.877 "base_bdevs_list": [ 00:14:02.877 { 00:14:02.877 "name": null, 00:14:02.877 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:02.877 "is_configured": false, 00:14:02.877 "data_offset": 0, 00:14:02.877 "data_size": 65536 00:14:02.877 }, 00:14:02.877 { 00:14:02.877 "name": null, 00:14:02.877 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:02.877 "is_configured": false, 00:14:02.877 "data_offset": 0, 00:14:02.877 "data_size": 65536 00:14:02.877 }, 00:14:02.877 { 00:14:02.877 "name": "BaseBdev3", 00:14:02.877 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:02.877 "is_configured": true, 00:14:02.877 "data_offset": 0, 00:14:02.877 "data_size": 65536 00:14:02.877 } 00:14:02.877 ] 00:14:02.877 }' 00:14:02.877 13:42:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.877 13:42:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.448 13:42:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.448 13:42:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:03.448 13:42:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:03.448 13:42:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:03.709 [2024-06-10 13:42:18.092966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.709 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.710 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.710 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.710 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.710 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.970 13:42:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.970 "name": "Existed_Raid", 00:14:03.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.970 "strip_size_kb": 0, 00:14:03.970 "state": "configuring", 00:14:03.970 "raid_level": "raid1", 00:14:03.970 "superblock": false, 00:14:03.970 "num_base_bdevs": 3, 00:14:03.970 "num_base_bdevs_discovered": 2, 00:14:03.970 "num_base_bdevs_operational": 3, 00:14:03.970 "base_bdevs_list": [ 00:14:03.970 { 00:14:03.970 "name": null, 00:14:03.970 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:03.970 "is_configured": false, 00:14:03.970 "data_offset": 0, 00:14:03.970 "data_size": 65536 00:14:03.970 }, 00:14:03.970 { 00:14:03.970 "name": "BaseBdev2", 00:14:03.970 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:03.970 "is_configured": true, 00:14:03.970 "data_offset": 0, 00:14:03.970 "data_size": 65536 00:14:03.970 }, 00:14:03.970 { 00:14:03.970 "name": "BaseBdev3", 00:14:03.970 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:03.970 "is_configured": true, 00:14:03.970 "data_offset": 0, 00:14:03.970 "data_size": 65536 00:14:03.970 } 00:14:03.970 ] 00:14:03.970 }' 00:14:03.970 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.970 13:42:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.543 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.543 13:42:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:04.804 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:04.804 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.804 
13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:04.804 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3d9509c0-d826-4749-aea8-d08fd450973a 00:14:05.065 [2024-06-10 13:42:19.429521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:05.065 [2024-06-10 13:42:19.429544] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe65b50 00:14:05.065 [2024-06-10 13:42:19.429549] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:05.065 [2024-06-10 13:42:19.429711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x100b070 00:14:05.065 [2024-06-10 13:42:19.429816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe65b50 00:14:05.065 [2024-06-10 13:42:19.429823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe65b50 00:14:05.065 [2024-06-10 13:42:19.429946] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:05.065 NewBaseBdev 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:05.065 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:05.065 13:42:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:05.325 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:05.585 [ 00:14:05.585 { 00:14:05.585 "name": "NewBaseBdev", 00:14:05.585 "aliases": [ 00:14:05.585 "3d9509c0-d826-4749-aea8-d08fd450973a" 00:14:05.585 ], 00:14:05.585 "product_name": "Malloc disk", 00:14:05.585 "block_size": 512, 00:14:05.585 "num_blocks": 65536, 00:14:05.585 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:05.585 "assigned_rate_limits": { 00:14:05.585 "rw_ios_per_sec": 0, 00:14:05.585 "rw_mbytes_per_sec": 0, 00:14:05.585 "r_mbytes_per_sec": 0, 00:14:05.585 "w_mbytes_per_sec": 0 00:14:05.585 }, 00:14:05.585 "claimed": true, 00:14:05.585 "claim_type": "exclusive_write", 00:14:05.585 "zoned": false, 00:14:05.585 "supported_io_types": { 00:14:05.585 "read": true, 00:14:05.585 "write": true, 00:14:05.585 "unmap": true, 00:14:05.585 "write_zeroes": true, 00:14:05.585 "flush": true, 00:14:05.585 "reset": true, 00:14:05.585 "compare": false, 00:14:05.585 "compare_and_write": false, 00:14:05.585 "abort": true, 00:14:05.585 "nvme_admin": false, 00:14:05.585 "nvme_io": false 00:14:05.585 }, 00:14:05.585 "memory_domains": [ 00:14:05.585 { 00:14:05.585 "dma_device_id": "system", 00:14:05.585 "dma_device_type": 1 00:14:05.585 }, 00:14:05.585 { 00:14:05.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.585 "dma_device_type": 2 00:14:05.585 } 00:14:05.585 ], 00:14:05.585 "driver_specific": {} 00:14:05.585 } 00:14:05.585 ] 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid 
online raid1 0 3 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.585 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.586 13:42:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.586 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.586 "name": "Existed_Raid", 00:14:05.586 "uuid": "f3fd935c-0e70-4e89-a493-13341a674dbd", 00:14:05.586 "strip_size_kb": 0, 00:14:05.586 "state": "online", 00:14:05.586 "raid_level": "raid1", 00:14:05.586 "superblock": false, 00:14:05.586 "num_base_bdevs": 3, 00:14:05.586 "num_base_bdevs_discovered": 3, 00:14:05.586 "num_base_bdevs_operational": 3, 00:14:05.586 "base_bdevs_list": [ 00:14:05.586 { 00:14:05.586 "name": "NewBaseBdev", 00:14:05.586 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:05.586 "is_configured": 
true, 00:14:05.586 "data_offset": 0, 00:14:05.586 "data_size": 65536 00:14:05.586 }, 00:14:05.586 { 00:14:05.586 "name": "BaseBdev2", 00:14:05.586 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:05.586 "is_configured": true, 00:14:05.586 "data_offset": 0, 00:14:05.586 "data_size": 65536 00:14:05.586 }, 00:14:05.586 { 00:14:05.586 "name": "BaseBdev3", 00:14:05.586 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:05.586 "is_configured": true, 00:14:05.586 "data_offset": 0, 00:14:05.586 "data_size": 65536 00:14:05.586 } 00:14:05.586 ] 00:14:05.586 }' 00:14:05.586 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.586 13:42:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:06.158 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:06.419 [2024-06-10 13:42:20.797235] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.419 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:06.419 "name": 
"Existed_Raid", 00:14:06.419 "aliases": [ 00:14:06.419 "f3fd935c-0e70-4e89-a493-13341a674dbd" 00:14:06.419 ], 00:14:06.419 "product_name": "Raid Volume", 00:14:06.419 "block_size": 512, 00:14:06.419 "num_blocks": 65536, 00:14:06.419 "uuid": "f3fd935c-0e70-4e89-a493-13341a674dbd", 00:14:06.419 "assigned_rate_limits": { 00:14:06.419 "rw_ios_per_sec": 0, 00:14:06.419 "rw_mbytes_per_sec": 0, 00:14:06.419 "r_mbytes_per_sec": 0, 00:14:06.419 "w_mbytes_per_sec": 0 00:14:06.419 }, 00:14:06.419 "claimed": false, 00:14:06.419 "zoned": false, 00:14:06.419 "supported_io_types": { 00:14:06.419 "read": true, 00:14:06.419 "write": true, 00:14:06.419 "unmap": false, 00:14:06.419 "write_zeroes": true, 00:14:06.419 "flush": false, 00:14:06.419 "reset": true, 00:14:06.419 "compare": false, 00:14:06.419 "compare_and_write": false, 00:14:06.419 "abort": false, 00:14:06.419 "nvme_admin": false, 00:14:06.419 "nvme_io": false 00:14:06.419 }, 00:14:06.419 "memory_domains": [ 00:14:06.419 { 00:14:06.419 "dma_device_id": "system", 00:14:06.419 "dma_device_type": 1 00:14:06.419 }, 00:14:06.419 { 00:14:06.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.419 "dma_device_type": 2 00:14:06.419 }, 00:14:06.419 { 00:14:06.419 "dma_device_id": "system", 00:14:06.419 "dma_device_type": 1 00:14:06.419 }, 00:14:06.419 { 00:14:06.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.420 "dma_device_type": 2 00:14:06.420 }, 00:14:06.420 { 00:14:06.420 "dma_device_id": "system", 00:14:06.420 "dma_device_type": 1 00:14:06.420 }, 00:14:06.420 { 00:14:06.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.420 "dma_device_type": 2 00:14:06.420 } 00:14:06.420 ], 00:14:06.420 "driver_specific": { 00:14:06.420 "raid": { 00:14:06.420 "uuid": "f3fd935c-0e70-4e89-a493-13341a674dbd", 00:14:06.420 "strip_size_kb": 0, 00:14:06.420 "state": "online", 00:14:06.420 "raid_level": "raid1", 00:14:06.420 "superblock": false, 00:14:06.420 "num_base_bdevs": 3, 00:14:06.420 "num_base_bdevs_discovered": 3, 00:14:06.420 
"num_base_bdevs_operational": 3, 00:14:06.420 "base_bdevs_list": [ 00:14:06.420 { 00:14:06.420 "name": "NewBaseBdev", 00:14:06.420 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:06.420 "is_configured": true, 00:14:06.420 "data_offset": 0, 00:14:06.420 "data_size": 65536 00:14:06.420 }, 00:14:06.420 { 00:14:06.420 "name": "BaseBdev2", 00:14:06.420 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:06.420 "is_configured": true, 00:14:06.420 "data_offset": 0, 00:14:06.420 "data_size": 65536 00:14:06.420 }, 00:14:06.420 { 00:14:06.420 "name": "BaseBdev3", 00:14:06.420 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:06.420 "is_configured": true, 00:14:06.420 "data_offset": 0, 00:14:06.420 "data_size": 65536 00:14:06.420 } 00:14:06.420 ] 00:14:06.420 } 00:14:06.420 } 00:14:06.420 }' 00:14:06.420 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:06.420 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:06.420 BaseBdev2 00:14:06.420 BaseBdev3' 00:14:06.420 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.420 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:06.420 13:42:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.681 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.681 "name": "NewBaseBdev", 00:14:06.681 "aliases": [ 00:14:06.681 "3d9509c0-d826-4749-aea8-d08fd450973a" 00:14:06.681 ], 00:14:06.681 "product_name": "Malloc disk", 00:14:06.681 "block_size": 512, 00:14:06.681 "num_blocks": 65536, 00:14:06.681 "uuid": "3d9509c0-d826-4749-aea8-d08fd450973a", 00:14:06.681 
"assigned_rate_limits": { 00:14:06.681 "rw_ios_per_sec": 0, 00:14:06.681 "rw_mbytes_per_sec": 0, 00:14:06.681 "r_mbytes_per_sec": 0, 00:14:06.681 "w_mbytes_per_sec": 0 00:14:06.681 }, 00:14:06.681 "claimed": true, 00:14:06.681 "claim_type": "exclusive_write", 00:14:06.681 "zoned": false, 00:14:06.681 "supported_io_types": { 00:14:06.681 "read": true, 00:14:06.681 "write": true, 00:14:06.681 "unmap": true, 00:14:06.681 "write_zeroes": true, 00:14:06.681 "flush": true, 00:14:06.681 "reset": true, 00:14:06.681 "compare": false, 00:14:06.681 "compare_and_write": false, 00:14:06.681 "abort": true, 00:14:06.681 "nvme_admin": false, 00:14:06.681 "nvme_io": false 00:14:06.681 }, 00:14:06.681 "memory_domains": [ 00:14:06.681 { 00:14:06.681 "dma_device_id": "system", 00:14:06.681 "dma_device_type": 1 00:14:06.681 }, 00:14:06.681 { 00:14:06.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.681 "dma_device_type": 2 00:14:06.681 } 00:14:06.681 ], 00:14:06.681 "driver_specific": {} 00:14:06.681 }' 00:14:06.681 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.681 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.681 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.681 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.942 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.203 "name": "BaseBdev2", 00:14:07.203 "aliases": [ 00:14:07.203 "e770fc58-575c-46f1-8857-47c93edeeae8" 00:14:07.203 ], 00:14:07.203 "product_name": "Malloc disk", 00:14:07.203 "block_size": 512, 00:14:07.203 "num_blocks": 65536, 00:14:07.203 "uuid": "e770fc58-575c-46f1-8857-47c93edeeae8", 00:14:07.203 "assigned_rate_limits": { 00:14:07.203 "rw_ios_per_sec": 0, 00:14:07.203 "rw_mbytes_per_sec": 0, 00:14:07.203 "r_mbytes_per_sec": 0, 00:14:07.203 "w_mbytes_per_sec": 0 00:14:07.203 }, 00:14:07.203 "claimed": true, 00:14:07.203 "claim_type": "exclusive_write", 00:14:07.203 "zoned": false, 00:14:07.203 "supported_io_types": { 00:14:07.203 "read": true, 00:14:07.203 "write": true, 00:14:07.203 "unmap": true, 00:14:07.203 "write_zeroes": true, 00:14:07.203 "flush": true, 00:14:07.203 "reset": true, 00:14:07.203 "compare": false, 00:14:07.203 "compare_and_write": false, 00:14:07.203 "abort": true, 00:14:07.203 "nvme_admin": false, 00:14:07.203 "nvme_io": false 00:14:07.203 }, 00:14:07.203 "memory_domains": [ 00:14:07.203 { 00:14:07.203 "dma_device_id": "system", 00:14:07.203 "dma_device_type": 1 00:14:07.203 }, 00:14:07.203 { 00:14:07.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.203 "dma_device_type": 
2 00:14:07.203 } 00:14:07.203 ], 00:14:07.203 "driver_specific": {} 00:14:07.203 }' 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.203 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.464 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.725 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.725 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.725 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.725 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:07.725 13:42:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.725 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.725 "name": "BaseBdev3", 00:14:07.725 "aliases": [ 00:14:07.725 "4ab32965-42c2-4a8e-b115-464d77e5bba9" 00:14:07.725 ], 00:14:07.725 "product_name": 
"Malloc disk", 00:14:07.725 "block_size": 512, 00:14:07.725 "num_blocks": 65536, 00:14:07.725 "uuid": "4ab32965-42c2-4a8e-b115-464d77e5bba9", 00:14:07.725 "assigned_rate_limits": { 00:14:07.725 "rw_ios_per_sec": 0, 00:14:07.725 "rw_mbytes_per_sec": 0, 00:14:07.725 "r_mbytes_per_sec": 0, 00:14:07.725 "w_mbytes_per_sec": 0 00:14:07.725 }, 00:14:07.725 "claimed": true, 00:14:07.725 "claim_type": "exclusive_write", 00:14:07.725 "zoned": false, 00:14:07.725 "supported_io_types": { 00:14:07.725 "read": true, 00:14:07.725 "write": true, 00:14:07.725 "unmap": true, 00:14:07.725 "write_zeroes": true, 00:14:07.725 "flush": true, 00:14:07.725 "reset": true, 00:14:07.725 "compare": false, 00:14:07.725 "compare_and_write": false, 00:14:07.725 "abort": true, 00:14:07.725 "nvme_admin": false, 00:14:07.725 "nvme_io": false 00:14:07.725 }, 00:14:07.725 "memory_domains": [ 00:14:07.725 { 00:14:07.725 "dma_device_id": "system", 00:14:07.725 "dma_device_type": 1 00:14:07.725 }, 00:14:07.725 { 00:14:07.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.725 "dma_device_type": 2 00:14:07.725 } 00:14:07.725 ], 00:14:07.725 "driver_specific": {} 00:14:07.725 }' 00:14:07.725 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.985 
13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.985 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.245 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.245 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.245 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:08.245 [2024-06-10 13:42:22.717879] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:08.245 [2024-06-10 13:42:22.717895] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:08.245 [2024-06-10 13:42:22.717934] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:08.245 [2024-06-10 13:42:22.718156] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:08.245 [2024-06-10 13:42:22.718168] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe65b50 name Existed_Raid, state offline 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1545301 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1545301 ']' 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1545301 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1545301 00:14:08.506 13:42:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1545301' 00:14:08.506 killing process with pid 1545301 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1545301 00:14:08.506 [2024-06-10 13:42:22.785156] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1545301 00:14:08.506 [2024-06-10 13:42:22.800312] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:08.506 00:14:08.506 real 0m24.581s 00:14:08.506 user 0m46.090s 00:14:08.506 sys 0m3.557s 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:08.506 13:42:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.506 ************************************ 00:14:08.506 END TEST raid_state_function_test 00:14:08.506 ************************************ 00:14:08.506 13:42:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:14:08.506 13:42:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:08.506 13:42:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:08.506 13:42:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:08.767 ************************************ 00:14:08.767 START TEST raid_state_function_test_sb 00:14:08.767 ************************************ 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 true 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:08.767 13:42:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:08.767 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1551131 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1551131' 00:14:08.768 Process raid pid: 1551131 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1551131 /var/tmp/spdk-raid.sock 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1551131 ']' 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:08.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:08.768 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.768 [2024-06-10 13:42:23.066332] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:14:08.768 [2024-06-10 13:42:23.066380] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:08.768 [2024-06-10 13:42:23.155248] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:08.768 [2024-06-10 13:42:23.222605] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.028 [2024-06-10 13:42:23.268649] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.028 [2024-06-10 13:42:23.268672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.599 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:09.599 13:42:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:14:09.599 13:42:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:09.860 [2024-06-10 13:42:24.124617] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:09.860 [2024-06-10 13:42:24.124646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:09.860 [2024-06-10 13:42:24.124652] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:09.860 [2024-06-10 13:42:24.124658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:09.860 [2024-06-10 13:42:24.124663] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:09.860 [2024-06-10 13:42:24.124669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.860 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:10.122 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.122 "name": "Existed_Raid", 00:14:10.122 "uuid": "db07212f-cb35-4060-91f2-e48cf23f1e1a", 00:14:10.122 "strip_size_kb": 0, 00:14:10.122 "state": "configuring", 00:14:10.122 "raid_level": "raid1", 00:14:10.122 "superblock": true, 00:14:10.122 "num_base_bdevs": 3, 00:14:10.122 "num_base_bdevs_discovered": 0, 00:14:10.122 "num_base_bdevs_operational": 3, 00:14:10.122 "base_bdevs_list": [ 00:14:10.122 { 00:14:10.122 "name": "BaseBdev1", 00:14:10.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.122 "is_configured": false, 00:14:10.122 "data_offset": 0, 00:14:10.122 "data_size": 0 00:14:10.122 }, 00:14:10.122 { 00:14:10.122 "name": "BaseBdev2", 00:14:10.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.122 "is_configured": false, 00:14:10.122 "data_offset": 0, 00:14:10.122 "data_size": 0 00:14:10.122 }, 00:14:10.122 { 00:14:10.122 "name": "BaseBdev3", 00:14:10.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.122 "is_configured": false, 00:14:10.122 "data_offset": 0, 00:14:10.122 "data_size": 0 00:14:10.122 } 00:14:10.122 ] 00:14:10.122 }' 00:14:10.122 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.122 13:42:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:10.694 13:42:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:10.694 [2024-06-10 13:42:25.074895] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:10.694 [2024-06-10 13:42:25.074916] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7c8740 name Existed_Raid, state configuring 00:14:10.694 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:10.954 [2024-06-10 13:42:25.279429] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:10.954 [2024-06-10 13:42:25.279445] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:10.954 [2024-06-10 13:42:25.279451] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:10.954 [2024-06-10 13:42:25.279457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:10.954 [2024-06-10 13:42:25.279461] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:10.954 [2024-06-10 13:42:25.279467] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:10.954 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:11.215 [2024-06-10 13:42:25.490925] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:11.215 BaseBdev1 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:14:11.215 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:11.475 [ 00:14:11.475 { 00:14:11.475 "name": "BaseBdev1", 00:14:11.475 "aliases": [ 00:14:11.475 "d6e878f6-6025-4b98-9855-184e28b76560" 00:14:11.475 ], 00:14:11.475 "product_name": "Malloc disk", 00:14:11.475 "block_size": 512, 00:14:11.475 "num_blocks": 65536, 00:14:11.475 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:11.475 "assigned_rate_limits": { 00:14:11.475 "rw_ios_per_sec": 0, 00:14:11.475 "rw_mbytes_per_sec": 0, 00:14:11.475 "r_mbytes_per_sec": 0, 00:14:11.475 "w_mbytes_per_sec": 0 00:14:11.475 }, 00:14:11.475 "claimed": true, 00:14:11.475 "claim_type": "exclusive_write", 00:14:11.475 "zoned": false, 00:14:11.475 "supported_io_types": { 00:14:11.475 "read": true, 00:14:11.475 "write": true, 00:14:11.475 "unmap": true, 00:14:11.475 "write_zeroes": true, 00:14:11.475 "flush": true, 00:14:11.475 "reset": true, 00:14:11.475 "compare": false, 00:14:11.475 "compare_and_write": false, 00:14:11.475 "abort": true, 00:14:11.475 "nvme_admin": false, 00:14:11.475 "nvme_io": false 00:14:11.475 }, 00:14:11.475 "memory_domains": [ 00:14:11.475 { 00:14:11.475 "dma_device_id": "system", 00:14:11.475 "dma_device_type": 1 00:14:11.475 }, 00:14:11.475 { 00:14:11.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.475 "dma_device_type": 2 00:14:11.475 } 00:14:11.475 ], 00:14:11.475 "driver_specific": {} 00:14:11.475 } 00:14:11.475 ] 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.475 13:42:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.736 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.736 "name": "Existed_Raid", 00:14:11.736 "uuid": "1803c8a5-3ffc-44e5-a13d-07b913b8e9b5", 00:14:11.736 "strip_size_kb": 0, 00:14:11.736 "state": "configuring", 00:14:11.736 "raid_level": "raid1", 00:14:11.736 "superblock": true, 00:14:11.736 "num_base_bdevs": 3, 00:14:11.736 "num_base_bdevs_discovered": 1, 00:14:11.736 "num_base_bdevs_operational": 3, 00:14:11.736 "base_bdevs_list": [ 00:14:11.736 { 00:14:11.736 
"name": "BaseBdev1", 00:14:11.736 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:11.736 "is_configured": true, 00:14:11.736 "data_offset": 2048, 00:14:11.736 "data_size": 63488 00:14:11.736 }, 00:14:11.736 { 00:14:11.736 "name": "BaseBdev2", 00:14:11.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.736 "is_configured": false, 00:14:11.736 "data_offset": 0, 00:14:11.736 "data_size": 0 00:14:11.736 }, 00:14:11.736 { 00:14:11.736 "name": "BaseBdev3", 00:14:11.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.736 "is_configured": false, 00:14:11.736 "data_offset": 0, 00:14:11.736 "data_size": 0 00:14:11.736 } 00:14:11.736 ] 00:14:11.736 }' 00:14:11.736 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.736 13:42:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.307 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:12.568 [2024-06-10 13:42:26.850399] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:12.568 [2024-06-10 13:42:26.850426] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7c8010 name Existed_Raid, state configuring 00:14:12.569 13:42:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:12.830 [2024-06-10 13:42:27.054948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:12.830 [2024-06-10 13:42:27.056160] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:12.830 [2024-06-10 13:42:27.056191] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:14:12.830 [2024-06-10 13:42:27.056197] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:12.830 [2024-06-10 13:42:27.056203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.830 "name": "Existed_Raid", 00:14:12.830 "uuid": "b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:12.830 "strip_size_kb": 0, 00:14:12.830 "state": "configuring", 00:14:12.830 "raid_level": "raid1", 00:14:12.830 "superblock": true, 00:14:12.830 "num_base_bdevs": 3, 00:14:12.830 "num_base_bdevs_discovered": 1, 00:14:12.830 "num_base_bdevs_operational": 3, 00:14:12.830 "base_bdevs_list": [ 00:14:12.830 { 00:14:12.830 "name": "BaseBdev1", 00:14:12.830 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:12.830 "is_configured": true, 00:14:12.830 "data_offset": 2048, 00:14:12.830 "data_size": 63488 00:14:12.830 }, 00:14:12.830 { 00:14:12.830 "name": "BaseBdev2", 00:14:12.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.830 "is_configured": false, 00:14:12.830 "data_offset": 0, 00:14:12.830 "data_size": 0 00:14:12.830 }, 00:14:12.830 { 00:14:12.830 "name": "BaseBdev3", 00:14:12.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.830 "is_configured": false, 00:14:12.830 "data_offset": 0, 00:14:12.830 "data_size": 0 00:14:12.830 } 00:14:12.830 ] 00:14:12.830 }' 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.830 13:42:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.402 13:42:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:13.662 [2024-06-10 13:42:28.042260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:13.662 BaseBdev2 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:13.662 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:13.924 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:14.186 [ 00:14:14.186 { 00:14:14.186 "name": "BaseBdev2", 00:14:14.186 "aliases": [ 00:14:14.186 "ab7f1a5c-d392-48e3-9c2b-93a78df423f8" 00:14:14.186 ], 00:14:14.186 "product_name": "Malloc disk", 00:14:14.186 "block_size": 512, 00:14:14.186 "num_blocks": 65536, 00:14:14.186 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:14.186 "assigned_rate_limits": { 00:14:14.186 "rw_ios_per_sec": 0, 00:14:14.186 "rw_mbytes_per_sec": 0, 00:14:14.186 "r_mbytes_per_sec": 0, 00:14:14.186 "w_mbytes_per_sec": 0 00:14:14.186 }, 00:14:14.186 "claimed": true, 00:14:14.186 "claim_type": "exclusive_write", 00:14:14.186 "zoned": false, 00:14:14.186 "supported_io_types": { 00:14:14.186 "read": true, 00:14:14.186 "write": true, 00:14:14.186 "unmap": true, 00:14:14.186 "write_zeroes": true, 00:14:14.186 "flush": true, 00:14:14.186 "reset": true, 00:14:14.186 "compare": false, 00:14:14.186 "compare_and_write": false, 00:14:14.186 "abort": true, 00:14:14.186 "nvme_admin": false, 00:14:14.186 "nvme_io": false 00:14:14.186 }, 00:14:14.186 "memory_domains": [ 
00:14:14.186 { 00:14:14.186 "dma_device_id": "system", 00:14:14.186 "dma_device_type": 1 00:14:14.186 }, 00:14:14.186 { 00:14:14.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.186 "dma_device_type": 2 00:14:14.186 } 00:14:14.186 ], 00:14:14.186 "driver_specific": {} 00:14:14.186 } 00:14:14.186 ] 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:14.186 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:14.447 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.447 "name": "Existed_Raid", 00:14:14.447 "uuid": "b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:14.447 "strip_size_kb": 0, 00:14:14.447 "state": "configuring", 00:14:14.447 "raid_level": "raid1", 00:14:14.447 "superblock": true, 00:14:14.447 "num_base_bdevs": 3, 00:14:14.447 "num_base_bdevs_discovered": 2, 00:14:14.447 "num_base_bdevs_operational": 3, 00:14:14.447 "base_bdevs_list": [ 00:14:14.447 { 00:14:14.447 "name": "BaseBdev1", 00:14:14.447 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:14.447 "is_configured": true, 00:14:14.447 "data_offset": 2048, 00:14:14.447 "data_size": 63488 00:14:14.447 }, 00:14:14.447 { 00:14:14.447 "name": "BaseBdev2", 00:14:14.447 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:14.447 "is_configured": true, 00:14:14.447 "data_offset": 2048, 00:14:14.447 "data_size": 63488 00:14:14.447 }, 00:14:14.447 { 00:14:14.447 "name": "BaseBdev3", 00:14:14.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:14.447 "is_configured": false, 00:14:14.447 "data_offset": 0, 00:14:14.447 "data_size": 0 00:14:14.447 } 00:14:14.447 ] 00:14:14.447 }' 00:14:14.447 13:42:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.447 13:42:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:15.018 [2024-06-10 13:42:29.402708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:15.018 [2024-06-10 13:42:29.402829] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7c8f00 00:14:15.018 [2024-06-10 13:42:29.402837] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:15.018 [2024-06-10 13:42:29.402982] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7dfdf0 00:14:15.018 [2024-06-10 13:42:29.403079] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7c8f00 00:14:15.018 [2024-06-10 13:42:29.403085] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7c8f00 00:14:15.018 [2024-06-10 13:42:29.403158] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:15.018 BaseBdev3 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:15.018 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:15.279 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:15.540 [ 00:14:15.540 { 00:14:15.540 "name": "BaseBdev3", 00:14:15.540 "aliases": [ 00:14:15.540 "139bade1-aa7e-48d2-95c9-ce18462d7249" 
00:14:15.540 ], 00:14:15.540 "product_name": "Malloc disk", 00:14:15.540 "block_size": 512, 00:14:15.540 "num_blocks": 65536, 00:14:15.540 "uuid": "139bade1-aa7e-48d2-95c9-ce18462d7249", 00:14:15.540 "assigned_rate_limits": { 00:14:15.540 "rw_ios_per_sec": 0, 00:14:15.540 "rw_mbytes_per_sec": 0, 00:14:15.540 "r_mbytes_per_sec": 0, 00:14:15.540 "w_mbytes_per_sec": 0 00:14:15.540 }, 00:14:15.540 "claimed": true, 00:14:15.540 "claim_type": "exclusive_write", 00:14:15.540 "zoned": false, 00:14:15.540 "supported_io_types": { 00:14:15.540 "read": true, 00:14:15.540 "write": true, 00:14:15.540 "unmap": true, 00:14:15.540 "write_zeroes": true, 00:14:15.540 "flush": true, 00:14:15.540 "reset": true, 00:14:15.540 "compare": false, 00:14:15.540 "compare_and_write": false, 00:14:15.540 "abort": true, 00:14:15.540 "nvme_admin": false, 00:14:15.540 "nvme_io": false 00:14:15.540 }, 00:14:15.540 "memory_domains": [ 00:14:15.540 { 00:14:15.540 "dma_device_id": "system", 00:14:15.540 "dma_device_type": 1 00:14:15.540 }, 00:14:15.540 { 00:14:15.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.540 "dma_device_type": 2 00:14:15.540 } 00:14:15.540 ], 00:14:15.540 "driver_specific": {} 00:14:15.540 } 00:14:15.540 ] 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.540 13:42:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.540 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.540 "name": "Existed_Raid", 00:14:15.540 "uuid": "b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:15.540 "strip_size_kb": 0, 00:14:15.540 "state": "online", 00:14:15.540 "raid_level": "raid1", 00:14:15.540 "superblock": true, 00:14:15.540 "num_base_bdevs": 3, 00:14:15.540 "num_base_bdevs_discovered": 3, 00:14:15.540 "num_base_bdevs_operational": 3, 00:14:15.540 "base_bdevs_list": [ 00:14:15.540 { 00:14:15.540 "name": "BaseBdev1", 00:14:15.540 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:15.540 "is_configured": true, 00:14:15.540 "data_offset": 2048, 00:14:15.540 "data_size": 63488 00:14:15.540 }, 00:14:15.540 { 00:14:15.540 "name": "BaseBdev2", 00:14:15.540 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:15.540 "is_configured": true, 00:14:15.540 "data_offset": 2048, 00:14:15.540 
"data_size": 63488 00:14:15.540 }, 00:14:15.540 { 00:14:15.540 "name": "BaseBdev3", 00:14:15.540 "uuid": "139bade1-aa7e-48d2-95c9-ce18462d7249", 00:14:15.540 "is_configured": true, 00:14:15.540 "data_offset": 2048, 00:14:15.540 "data_size": 63488 00:14:15.540 } 00:14:15.540 ] 00:14:15.540 }' 00:14:15.540 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.540 13:42:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:16.112 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:16.373 [2024-06-10 13:42:30.770495] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:16.373 "name": "Existed_Raid", 00:14:16.373 "aliases": [ 00:14:16.373 "b9837d22-2ffe-4acb-8f10-d52a63388327" 00:14:16.373 ], 00:14:16.373 "product_name": "Raid Volume", 00:14:16.373 "block_size": 512, 00:14:16.373 "num_blocks": 63488, 00:14:16.373 "uuid": 
"b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:16.373 "assigned_rate_limits": { 00:14:16.373 "rw_ios_per_sec": 0, 00:14:16.373 "rw_mbytes_per_sec": 0, 00:14:16.373 "r_mbytes_per_sec": 0, 00:14:16.373 "w_mbytes_per_sec": 0 00:14:16.373 }, 00:14:16.373 "claimed": false, 00:14:16.373 "zoned": false, 00:14:16.373 "supported_io_types": { 00:14:16.373 "read": true, 00:14:16.373 "write": true, 00:14:16.373 "unmap": false, 00:14:16.373 "write_zeroes": true, 00:14:16.373 "flush": false, 00:14:16.373 "reset": true, 00:14:16.373 "compare": false, 00:14:16.373 "compare_and_write": false, 00:14:16.373 "abort": false, 00:14:16.373 "nvme_admin": false, 00:14:16.373 "nvme_io": false 00:14:16.373 }, 00:14:16.373 "memory_domains": [ 00:14:16.373 { 00:14:16.373 "dma_device_id": "system", 00:14:16.373 "dma_device_type": 1 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.373 "dma_device_type": 2 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "dma_device_id": "system", 00:14:16.373 "dma_device_type": 1 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.373 "dma_device_type": 2 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "dma_device_id": "system", 00:14:16.373 "dma_device_type": 1 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.373 "dma_device_type": 2 00:14:16.373 } 00:14:16.373 ], 00:14:16.373 "driver_specific": { 00:14:16.373 "raid": { 00:14:16.373 "uuid": "b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:16.373 "strip_size_kb": 0, 00:14:16.373 "state": "online", 00:14:16.373 "raid_level": "raid1", 00:14:16.373 "superblock": true, 00:14:16.373 "num_base_bdevs": 3, 00:14:16.373 "num_base_bdevs_discovered": 3, 00:14:16.373 "num_base_bdevs_operational": 3, 00:14:16.373 "base_bdevs_list": [ 00:14:16.373 { 00:14:16.373 "name": "BaseBdev1", 00:14:16.373 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:16.373 "is_configured": true, 00:14:16.373 "data_offset": 
2048, 00:14:16.373 "data_size": 63488 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "name": "BaseBdev2", 00:14:16.373 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:16.373 "is_configured": true, 00:14:16.373 "data_offset": 2048, 00:14:16.373 "data_size": 63488 00:14:16.373 }, 00:14:16.373 { 00:14:16.373 "name": "BaseBdev3", 00:14:16.373 "uuid": "139bade1-aa7e-48d2-95c9-ce18462d7249", 00:14:16.373 "is_configured": true, 00:14:16.373 "data_offset": 2048, 00:14:16.373 "data_size": 63488 00:14:16.373 } 00:14:16.373 ] 00:14:16.373 } 00:14:16.373 } 00:14:16.373 }' 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:16.373 BaseBdev2 00:14:16.373 BaseBdev3' 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:16.373 13:42:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.634 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:16.634 "name": "BaseBdev1", 00:14:16.634 "aliases": [ 00:14:16.634 "d6e878f6-6025-4b98-9855-184e28b76560" 00:14:16.634 ], 00:14:16.634 "product_name": "Malloc disk", 00:14:16.634 "block_size": 512, 00:14:16.634 "num_blocks": 65536, 00:14:16.634 "uuid": "d6e878f6-6025-4b98-9855-184e28b76560", 00:14:16.634 "assigned_rate_limits": { 00:14:16.634 "rw_ios_per_sec": 0, 00:14:16.634 "rw_mbytes_per_sec": 0, 00:14:16.634 "r_mbytes_per_sec": 0, 00:14:16.634 "w_mbytes_per_sec": 0 00:14:16.634 }, 00:14:16.634 "claimed": true, 00:14:16.634 "claim_type": 
"exclusive_write", 00:14:16.634 "zoned": false, 00:14:16.634 "supported_io_types": { 00:14:16.634 "read": true, 00:14:16.634 "write": true, 00:14:16.634 "unmap": true, 00:14:16.634 "write_zeroes": true, 00:14:16.634 "flush": true, 00:14:16.634 "reset": true, 00:14:16.634 "compare": false, 00:14:16.634 "compare_and_write": false, 00:14:16.634 "abort": true, 00:14:16.634 "nvme_admin": false, 00:14:16.634 "nvme_io": false 00:14:16.634 }, 00:14:16.634 "memory_domains": [ 00:14:16.634 { 00:14:16.634 "dma_device_id": "system", 00:14:16.634 "dma_device_type": 1 00:14:16.634 }, 00:14:16.634 { 00:14:16.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.634 "dma_device_type": 2 00:14:16.634 } 00:14:16.634 ], 00:14:16.634 "driver_specific": {} 00:14:16.634 }' 00:14:16.634 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.634 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:16.895 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:17.155 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.155 "name": "BaseBdev2", 00:14:17.155 "aliases": [ 00:14:17.155 "ab7f1a5c-d392-48e3-9c2b-93a78df423f8" 00:14:17.155 ], 00:14:17.155 "product_name": "Malloc disk", 00:14:17.155 "block_size": 512, 00:14:17.155 "num_blocks": 65536, 00:14:17.155 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:17.155 "assigned_rate_limits": { 00:14:17.155 "rw_ios_per_sec": 0, 00:14:17.155 "rw_mbytes_per_sec": 0, 00:14:17.155 "r_mbytes_per_sec": 0, 00:14:17.155 "w_mbytes_per_sec": 0 00:14:17.155 }, 00:14:17.155 "claimed": true, 00:14:17.155 "claim_type": "exclusive_write", 00:14:17.155 "zoned": false, 00:14:17.155 "supported_io_types": { 00:14:17.155 "read": true, 00:14:17.155 "write": true, 00:14:17.155 "unmap": true, 00:14:17.155 "write_zeroes": true, 00:14:17.155 "flush": true, 00:14:17.155 "reset": true, 00:14:17.155 "compare": false, 00:14:17.155 "compare_and_write": false, 00:14:17.155 "abort": true, 00:14:17.155 "nvme_admin": false, 00:14:17.155 "nvme_io": false 00:14:17.155 }, 00:14:17.155 "memory_domains": [ 00:14:17.155 { 00:14:17.155 "dma_device_id": "system", 00:14:17.155 "dma_device_type": 1 00:14:17.155 }, 00:14:17.155 { 00:14:17.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.155 "dma_device_type": 2 00:14:17.156 } 00:14:17.156 ], 00:14:17.156 "driver_specific": {} 00:14:17.156 }' 00:14:17.156 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:14:17.156 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.416 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:17.677 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:17.677 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.677 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:17.677 13:42:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.677 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.677 "name": "BaseBdev3", 00:14:17.677 "aliases": [ 00:14:17.677 "139bade1-aa7e-48d2-95c9-ce18462d7249" 00:14:17.677 ], 00:14:17.677 "product_name": "Malloc disk", 00:14:17.677 "block_size": 512, 00:14:17.677 "num_blocks": 65536, 00:14:17.677 "uuid": 
"139bade1-aa7e-48d2-95c9-ce18462d7249", 00:14:17.677 "assigned_rate_limits": { 00:14:17.677 "rw_ios_per_sec": 0, 00:14:17.677 "rw_mbytes_per_sec": 0, 00:14:17.677 "r_mbytes_per_sec": 0, 00:14:17.677 "w_mbytes_per_sec": 0 00:14:17.677 }, 00:14:17.677 "claimed": true, 00:14:17.677 "claim_type": "exclusive_write", 00:14:17.677 "zoned": false, 00:14:17.677 "supported_io_types": { 00:14:17.677 "read": true, 00:14:17.677 "write": true, 00:14:17.677 "unmap": true, 00:14:17.677 "write_zeroes": true, 00:14:17.677 "flush": true, 00:14:17.677 "reset": true, 00:14:17.677 "compare": false, 00:14:17.677 "compare_and_write": false, 00:14:17.677 "abort": true, 00:14:17.677 "nvme_admin": false, 00:14:17.677 "nvme_io": false 00:14:17.677 }, 00:14:17.677 "memory_domains": [ 00:14:17.677 { 00:14:17.677 "dma_device_id": "system", 00:14:17.677 "dma_device_type": 1 00:14:17.677 }, 00:14:17.677 { 00:14:17.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.677 "dma_device_type": 2 00:14:17.677 } 00:14:17.677 ], 00:14:17.677 "driver_specific": {} 00:14:17.677 }' 00:14:17.677 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ 
null == null ]] 00:14:17.938 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.197 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.197 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.197 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:18.197 [2024-06-10 13:42:32.659061] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:18.458 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.459 "name": "Existed_Raid", 00:14:18.459 "uuid": "b9837d22-2ffe-4acb-8f10-d52a63388327", 00:14:18.459 "strip_size_kb": 0, 00:14:18.459 "state": "online", 00:14:18.459 "raid_level": "raid1", 00:14:18.459 "superblock": true, 00:14:18.459 "num_base_bdevs": 3, 00:14:18.459 "num_base_bdevs_discovered": 2, 00:14:18.459 "num_base_bdevs_operational": 2, 00:14:18.459 "base_bdevs_list": [ 00:14:18.459 { 00:14:18.459 "name": null, 00:14:18.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.459 "is_configured": false, 00:14:18.459 "data_offset": 2048, 00:14:18.459 "data_size": 63488 00:14:18.459 }, 00:14:18.459 { 00:14:18.459 "name": "BaseBdev2", 00:14:18.459 "uuid": "ab7f1a5c-d392-48e3-9c2b-93a78df423f8", 00:14:18.459 "is_configured": true, 00:14:18.459 "data_offset": 2048, 00:14:18.459 "data_size": 63488 00:14:18.459 }, 00:14:18.459 { 00:14:18.459 "name": "BaseBdev3", 00:14:18.459 "uuid": "139bade1-aa7e-48d2-95c9-ce18462d7249", 00:14:18.459 "is_configured": true, 00:14:18.459 "data_offset": 2048, 00:14:18.459 "data_size": 63488 00:14:18.459 } 00:14:18.459 ] 00:14:18.459 }' 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:14:18.459 13:42:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.029 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:19.029 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:19.029 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:19.029 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.290 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:19.290 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:19.290 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:19.550 [2024-06-10 13:42:33.830063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:19.550 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:19.550 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:19.550 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.550 13:42:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:19.811 
13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:19.811 [2024-06-10 13:42:34.241141] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:19.811 [2024-06-10 13:42:34.241209] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:19.811 [2024-06-10 13:42:34.247360] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:19.811 [2024-06-10 13:42:34.247385] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:19.811 [2024-06-10 13:42:34.247391] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7c8f00 name Existed_Raid, state offline 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:19.811 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:20.072 13:42:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:20.333 BaseBdev2 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:20.333 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:20.595 13:42:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:20.595 [ 00:14:20.595 { 00:14:20.595 "name": "BaseBdev2", 00:14:20.595 "aliases": [ 00:14:20.595 "8002c26e-28f2-4142-9529-858a0d72de54" 00:14:20.595 ], 00:14:20.595 "product_name": "Malloc disk", 00:14:20.595 "block_size": 512, 00:14:20.595 "num_blocks": 65536, 00:14:20.595 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:20.595 "assigned_rate_limits": { 00:14:20.595 "rw_ios_per_sec": 0, 00:14:20.595 "rw_mbytes_per_sec": 0, 00:14:20.595 "r_mbytes_per_sec": 0, 00:14:20.595 "w_mbytes_per_sec": 0 00:14:20.595 }, 00:14:20.595 "claimed": false, 00:14:20.595 "zoned": false, 00:14:20.595 "supported_io_types": { 00:14:20.595 "read": true, 00:14:20.595 "write": true, 00:14:20.595 "unmap": 
true, 00:14:20.595 "write_zeroes": true, 00:14:20.595 "flush": true, 00:14:20.595 "reset": true, 00:14:20.595 "compare": false, 00:14:20.595 "compare_and_write": false, 00:14:20.595 "abort": true, 00:14:20.595 "nvme_admin": false, 00:14:20.595 "nvme_io": false 00:14:20.595 }, 00:14:20.595 "memory_domains": [ 00:14:20.595 { 00:14:20.595 "dma_device_id": "system", 00:14:20.595 "dma_device_type": 1 00:14:20.595 }, 00:14:20.595 { 00:14:20.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.595 "dma_device_type": 2 00:14:20.595 } 00:14:20.595 ], 00:14:20.595 "driver_specific": {} 00:14:20.595 } 00:14:20.595 ] 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:20.856 BaseBdev3 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:20.856 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:21.117 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:21.378 [ 00:14:21.378 { 00:14:21.378 "name": "BaseBdev3", 00:14:21.378 "aliases": [ 00:14:21.378 "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6" 00:14:21.378 ], 00:14:21.378 "product_name": "Malloc disk", 00:14:21.378 "block_size": 512, 00:14:21.378 "num_blocks": 65536, 00:14:21.378 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:21.378 "assigned_rate_limits": { 00:14:21.378 "rw_ios_per_sec": 0, 00:14:21.378 "rw_mbytes_per_sec": 0, 00:14:21.378 "r_mbytes_per_sec": 0, 00:14:21.378 "w_mbytes_per_sec": 0 00:14:21.378 }, 00:14:21.378 "claimed": false, 00:14:21.378 "zoned": false, 00:14:21.378 "supported_io_types": { 00:14:21.378 "read": true, 00:14:21.378 "write": true, 00:14:21.378 "unmap": true, 00:14:21.378 "write_zeroes": true, 00:14:21.378 "flush": true, 00:14:21.378 "reset": true, 00:14:21.378 "compare": false, 00:14:21.378 "compare_and_write": false, 00:14:21.378 "abort": true, 00:14:21.378 "nvme_admin": false, 00:14:21.378 "nvme_io": false 00:14:21.378 }, 00:14:21.378 "memory_domains": [ 00:14:21.378 { 00:14:21.378 "dma_device_id": "system", 00:14:21.378 "dma_device_type": 1 00:14:21.378 }, 00:14:21.378 { 00:14:21.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.378 "dma_device_type": 2 00:14:21.378 } 00:14:21.378 ], 00:14:21.378 "driver_specific": {} 00:14:21.378 } 00:14:21.378 ] 00:14:21.378 13:42:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:21.378 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:21.378 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:21.378 13:42:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:21.639 [2024-06-10 13:42:35.857426] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:21.639 [2024-06-10 13:42:35.857455] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:21.639 [2024-06-10 13:42:35.857467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:21.639 [2024-06-10 13:42:35.858574] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.639 13:42:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.639 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.639 "name": "Existed_Raid", 00:14:21.639 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:21.639 "strip_size_kb": 0, 00:14:21.639 "state": "configuring", 00:14:21.639 "raid_level": "raid1", 00:14:21.639 "superblock": true, 00:14:21.639 "num_base_bdevs": 3, 00:14:21.639 "num_base_bdevs_discovered": 2, 00:14:21.639 "num_base_bdevs_operational": 3, 00:14:21.639 "base_bdevs_list": [ 00:14:21.639 { 00:14:21.639 "name": "BaseBdev1", 00:14:21.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.639 "is_configured": false, 00:14:21.639 "data_offset": 0, 00:14:21.639 "data_size": 0 00:14:21.639 }, 00:14:21.639 { 00:14:21.639 "name": "BaseBdev2", 00:14:21.639 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:21.639 "is_configured": true, 00:14:21.639 "data_offset": 2048, 00:14:21.639 "data_size": 63488 00:14:21.639 }, 00:14:21.639 { 00:14:21.639 "name": "BaseBdev3", 00:14:21.639 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:21.639 "is_configured": true, 00:14:21.639 "data_offset": 2048, 00:14:21.639 "data_size": 63488 00:14:21.639 } 00:14:21.639 ] 00:14:21.639 }' 00:14:21.639 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.639 13:42:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:22.211 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:22.472 [2024-06-10 13:42:36.807824] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.472 13:42:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.733 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.733 "name": "Existed_Raid", 00:14:22.733 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:22.733 "strip_size_kb": 0, 00:14:22.733 "state": "configuring", 00:14:22.733 "raid_level": "raid1", 00:14:22.733 "superblock": true, 00:14:22.733 "num_base_bdevs": 3, 00:14:22.733 
"num_base_bdevs_discovered": 1, 00:14:22.733 "num_base_bdevs_operational": 3, 00:14:22.733 "base_bdevs_list": [ 00:14:22.733 { 00:14:22.733 "name": "BaseBdev1", 00:14:22.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.733 "is_configured": false, 00:14:22.733 "data_offset": 0, 00:14:22.733 "data_size": 0 00:14:22.733 }, 00:14:22.733 { 00:14:22.733 "name": null, 00:14:22.733 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:22.733 "is_configured": false, 00:14:22.733 "data_offset": 2048, 00:14:22.733 "data_size": 63488 00:14:22.733 }, 00:14:22.733 { 00:14:22.733 "name": "BaseBdev3", 00:14:22.733 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:22.733 "is_configured": true, 00:14:22.733 "data_offset": 2048, 00:14:22.733 "data_size": 63488 00:14:22.733 } 00:14:22.733 ] 00:14:22.733 }' 00:14:22.733 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.733 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.304 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.304 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:23.304 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:23.304 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:23.565 [2024-06-10 13:42:37.887664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:23.565 BaseBdev1 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:23.565 13:42:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:23.565 13:42:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:23.827 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:23.827 [ 00:14:23.827 { 00:14:23.827 "name": "BaseBdev1", 00:14:23.827 "aliases": [ 00:14:23.827 "3047fb05-8b45-4404-baf7-445c33c515da" 00:14:23.827 ], 00:14:23.827 "product_name": "Malloc disk", 00:14:23.827 "block_size": 512, 00:14:23.827 "num_blocks": 65536, 00:14:23.827 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:23.827 "assigned_rate_limits": { 00:14:23.827 "rw_ios_per_sec": 0, 00:14:23.827 "rw_mbytes_per_sec": 0, 00:14:23.827 "r_mbytes_per_sec": 0, 00:14:23.827 "w_mbytes_per_sec": 0 00:14:23.827 }, 00:14:23.827 "claimed": true, 00:14:23.827 "claim_type": "exclusive_write", 00:14:23.827 "zoned": false, 00:14:23.827 "supported_io_types": { 00:14:23.827 "read": true, 00:14:23.827 "write": true, 00:14:23.827 "unmap": true, 00:14:23.827 "write_zeroes": true, 00:14:23.827 "flush": true, 00:14:23.827 "reset": true, 00:14:23.827 "compare": false, 00:14:23.827 "compare_and_write": false, 00:14:23.827 "abort": true, 00:14:23.827 "nvme_admin": false, 00:14:23.827 "nvme_io": false 
00:14:23.827 }, 00:14:23.827 "memory_domains": [ 00:14:23.827 { 00:14:23.827 "dma_device_id": "system", 00:14:23.827 "dma_device_type": 1 00:14:23.827 }, 00:14:23.827 { 00:14:23.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.827 "dma_device_type": 2 00:14:23.827 } 00:14:23.827 ], 00:14:23.827 "driver_specific": {} 00:14:23.827 } 00:14:23.827 ] 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.087 
13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.087 "name": "Existed_Raid", 00:14:24.087 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:24.087 "strip_size_kb": 0, 00:14:24.087 "state": "configuring", 00:14:24.087 "raid_level": "raid1", 00:14:24.087 "superblock": true, 00:14:24.087 "num_base_bdevs": 3, 00:14:24.087 "num_base_bdevs_discovered": 2, 00:14:24.087 "num_base_bdevs_operational": 3, 00:14:24.087 "base_bdevs_list": [ 00:14:24.087 { 00:14:24.087 "name": "BaseBdev1", 00:14:24.087 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:24.087 "is_configured": true, 00:14:24.087 "data_offset": 2048, 00:14:24.087 "data_size": 63488 00:14:24.087 }, 00:14:24.087 { 00:14:24.087 "name": null, 00:14:24.087 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:24.087 "is_configured": false, 00:14:24.087 "data_offset": 2048, 00:14:24.087 "data_size": 63488 00:14:24.087 }, 00:14:24.087 { 00:14:24.087 "name": "BaseBdev3", 00:14:24.087 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:24.087 "is_configured": true, 00:14:24.087 "data_offset": 2048, 00:14:24.087 "data_size": 63488 00:14:24.087 } 00:14:24.087 ] 00:14:24.087 }' 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.087 13:42:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.660 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.660 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:24.920 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:24.920 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:25.180 [2024-06-10 13:42:39.443680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.180 "name": "Existed_Raid", 00:14:25.180 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:25.180 "strip_size_kb": 0, 
00:14:25.180 "state": "configuring", 00:14:25.180 "raid_level": "raid1", 00:14:25.180 "superblock": true, 00:14:25.180 "num_base_bdevs": 3, 00:14:25.180 "num_base_bdevs_discovered": 1, 00:14:25.180 "num_base_bdevs_operational": 3, 00:14:25.180 "base_bdevs_list": [ 00:14:25.180 { 00:14:25.180 "name": "BaseBdev1", 00:14:25.180 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:25.180 "is_configured": true, 00:14:25.180 "data_offset": 2048, 00:14:25.180 "data_size": 63488 00:14:25.180 }, 00:14:25.180 { 00:14:25.180 "name": null, 00:14:25.180 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:25.180 "is_configured": false, 00:14:25.180 "data_offset": 2048, 00:14:25.180 "data_size": 63488 00:14:25.180 }, 00:14:25.180 { 00:14:25.180 "name": null, 00:14:25.180 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:25.180 "is_configured": false, 00:14:25.180 "data_offset": 2048, 00:14:25.180 "data_size": 63488 00:14:25.180 } 00:14:25.180 ] 00:14:25.180 }' 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.180 13:42:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:25.750 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.750 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:26.012 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:26.012 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:26.273 [2024-06-10 13:42:40.578597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:26.273 
13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.273 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.534 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.534 "name": "Existed_Raid", 00:14:26.534 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:26.534 "strip_size_kb": 0, 00:14:26.534 "state": "configuring", 00:14:26.534 "raid_level": "raid1", 00:14:26.534 "superblock": true, 00:14:26.534 "num_base_bdevs": 3, 00:14:26.534 "num_base_bdevs_discovered": 2, 00:14:26.534 "num_base_bdevs_operational": 3, 00:14:26.534 
"base_bdevs_list": [ 00:14:26.534 { 00:14:26.534 "name": "BaseBdev1", 00:14:26.534 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:26.534 "is_configured": true, 00:14:26.534 "data_offset": 2048, 00:14:26.534 "data_size": 63488 00:14:26.534 }, 00:14:26.534 { 00:14:26.534 "name": null, 00:14:26.534 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:26.534 "is_configured": false, 00:14:26.534 "data_offset": 2048, 00:14:26.534 "data_size": 63488 00:14:26.534 }, 00:14:26.534 { 00:14:26.534 "name": "BaseBdev3", 00:14:26.534 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:26.534 "is_configured": true, 00:14:26.534 "data_offset": 2048, 00:14:26.534 "data_size": 63488 00:14:26.534 } 00:14:26.534 ] 00:14:26.534 }' 00:14:26.534 13:42:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.534 13:42:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:27.105 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.105 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:27.366 [2024-06-10 13:42:41.781662] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.366 13:42:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.366 13:42:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.627 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.627 "name": "Existed_Raid", 00:14:27.627 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:27.627 "strip_size_kb": 0, 00:14:27.627 "state": "configuring", 00:14:27.627 "raid_level": "raid1", 00:14:27.627 "superblock": true, 00:14:27.627 "num_base_bdevs": 3, 00:14:27.627 "num_base_bdevs_discovered": 1, 00:14:27.627 "num_base_bdevs_operational": 3, 00:14:27.627 "base_bdevs_list": [ 00:14:27.627 { 00:14:27.627 "name": null, 00:14:27.627 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:27.627 "is_configured": false, 00:14:27.627 "data_offset": 2048, 00:14:27.627 "data_size": 63488 00:14:27.627 }, 00:14:27.627 { 00:14:27.627 "name": null, 
00:14:27.627 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:27.627 "is_configured": false, 00:14:27.627 "data_offset": 2048, 00:14:27.627 "data_size": 63488 00:14:27.627 }, 00:14:27.627 { 00:14:27.627 "name": "BaseBdev3", 00:14:27.627 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:27.627 "is_configured": true, 00:14:27.627 "data_offset": 2048, 00:14:27.627 "data_size": 63488 00:14:27.627 } 00:14:27.627 ] 00:14:27.627 }' 00:14:27.627 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.627 13:42:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:28.223 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.223 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:28.550 [2024-06-10 13:42:42.962666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:28.550 13:42:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.550 13:42:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.811 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.811 "name": "Existed_Raid", 00:14:28.811 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:28.811 "strip_size_kb": 0, 00:14:28.811 "state": "configuring", 00:14:28.811 "raid_level": "raid1", 00:14:28.811 "superblock": true, 00:14:28.811 "num_base_bdevs": 3, 00:14:28.811 "num_base_bdevs_discovered": 2, 00:14:28.811 "num_base_bdevs_operational": 3, 00:14:28.811 "base_bdevs_list": [ 00:14:28.811 { 00:14:28.811 "name": null, 00:14:28.811 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:28.811 "is_configured": false, 00:14:28.811 "data_offset": 2048, 00:14:28.811 "data_size": 63488 00:14:28.811 }, 00:14:28.811 { 00:14:28.811 "name": "BaseBdev2", 00:14:28.811 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:28.811 "is_configured": true, 00:14:28.811 "data_offset": 2048, 00:14:28.811 "data_size": 63488 00:14:28.811 }, 00:14:28.811 { 00:14:28.811 "name": "BaseBdev3", 
00:14:28.811 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:28.811 "is_configured": true, 00:14:28.811 "data_offset": 2048, 00:14:28.811 "data_size": 63488 00:14:28.811 } 00:14:28.811 ] 00:14:28.811 }' 00:14:28.811 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.811 13:42:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:29.382 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.382 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:29.643 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:29.643 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.643 13:42:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3047fb05-8b45-4404-baf7-445c33c515da 00:14:29.904 [2024-06-10 13:42:44.319204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:29.904 [2024-06-10 13:42:44.319313] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x97c290 00:14:29.904 [2024-06-10 13:42:44.319321] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:29.904 [2024-06-10 13:42:44.319468] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7c8710 00:14:29.904 [2024-06-10 13:42:44.319562] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x97c290 00:14:29.904 [2024-06-10 13:42:44.319568] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x97c290 00:14:29.904 [2024-06-10 13:42:44.319640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:29.904 NewBaseBdev 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:29.904 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:30.164 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:30.425 [ 00:14:30.425 { 00:14:30.425 "name": "NewBaseBdev", 00:14:30.425 "aliases": [ 00:14:30.425 "3047fb05-8b45-4404-baf7-445c33c515da" 00:14:30.425 ], 00:14:30.425 "product_name": "Malloc disk", 00:14:30.425 "block_size": 512, 00:14:30.425 "num_blocks": 65536, 00:14:30.425 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:30.425 "assigned_rate_limits": { 00:14:30.425 "rw_ios_per_sec": 0, 00:14:30.425 "rw_mbytes_per_sec": 0, 00:14:30.425 "r_mbytes_per_sec": 0, 00:14:30.425 
"w_mbytes_per_sec": 0 00:14:30.425 }, 00:14:30.425 "claimed": true, 00:14:30.425 "claim_type": "exclusive_write", 00:14:30.425 "zoned": false, 00:14:30.425 "supported_io_types": { 00:14:30.425 "read": true, 00:14:30.425 "write": true, 00:14:30.425 "unmap": true, 00:14:30.425 "write_zeroes": true, 00:14:30.425 "flush": true, 00:14:30.425 "reset": true, 00:14:30.425 "compare": false, 00:14:30.425 "compare_and_write": false, 00:14:30.425 "abort": true, 00:14:30.425 "nvme_admin": false, 00:14:30.425 "nvme_io": false 00:14:30.425 }, 00:14:30.425 "memory_domains": [ 00:14:30.425 { 00:14:30.425 "dma_device_id": "system", 00:14:30.425 "dma_device_type": 1 00:14:30.425 }, 00:14:30.425 { 00:14:30.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.425 "dma_device_type": 2 00:14:30.425 } 00:14:30.425 ], 00:14:30.425 "driver_specific": {} 00:14:30.425 } 00:14:30.425 ] 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.425 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.686 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.686 "name": "Existed_Raid", 00:14:30.686 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:30.686 "strip_size_kb": 0, 00:14:30.686 "state": "online", 00:14:30.686 "raid_level": "raid1", 00:14:30.686 "superblock": true, 00:14:30.686 "num_base_bdevs": 3, 00:14:30.686 "num_base_bdevs_discovered": 3, 00:14:30.686 "num_base_bdevs_operational": 3, 00:14:30.686 "base_bdevs_list": [ 00:14:30.686 { 00:14:30.686 "name": "NewBaseBdev", 00:14:30.686 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:30.686 "is_configured": true, 00:14:30.686 "data_offset": 2048, 00:14:30.686 "data_size": 63488 00:14:30.686 }, 00:14:30.686 { 00:14:30.686 "name": "BaseBdev2", 00:14:30.686 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:30.686 "is_configured": true, 00:14:30.686 "data_offset": 2048, 00:14:30.686 "data_size": 63488 00:14:30.686 }, 00:14:30.686 { 00:14:30.686 "name": "BaseBdev3", 00:14:30.686 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:30.686 "is_configured": true, 00:14:30.686 "data_offset": 2048, 00:14:30.686 "data_size": 63488 00:14:30.686 } 00:14:30.686 ] 00:14:30.686 }' 00:14:30.686 13:42:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.686 13:42:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:31.258 [2024-06-10 13:42:45.694911] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:31.258 "name": "Existed_Raid", 00:14:31.258 "aliases": [ 00:14:31.258 "ff7ac9df-0daf-4959-a0ea-fb2dcc295086" 00:14:31.258 ], 00:14:31.258 "product_name": "Raid Volume", 00:14:31.258 "block_size": 512, 00:14:31.258 "num_blocks": 63488, 00:14:31.258 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:31.258 "assigned_rate_limits": { 00:14:31.258 "rw_ios_per_sec": 0, 00:14:31.258 "rw_mbytes_per_sec": 0, 00:14:31.258 "r_mbytes_per_sec": 0, 00:14:31.258 "w_mbytes_per_sec": 0 00:14:31.258 }, 00:14:31.258 "claimed": false, 00:14:31.258 "zoned": false, 00:14:31.258 "supported_io_types": { 00:14:31.258 "read": true, 00:14:31.258 "write": true, 00:14:31.258 "unmap": false, 00:14:31.258 "write_zeroes": true, 00:14:31.258 "flush": false, 00:14:31.258 "reset": true, 00:14:31.258 "compare": false, 00:14:31.258 
"compare_and_write": false, 00:14:31.258 "abort": false, 00:14:31.258 "nvme_admin": false, 00:14:31.258 "nvme_io": false 00:14:31.258 }, 00:14:31.258 "memory_domains": [ 00:14:31.258 { 00:14:31.258 "dma_device_id": "system", 00:14:31.258 "dma_device_type": 1 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.258 "dma_device_type": 2 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "dma_device_id": "system", 00:14:31.258 "dma_device_type": 1 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.258 "dma_device_type": 2 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "dma_device_id": "system", 00:14:31.258 "dma_device_type": 1 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.258 "dma_device_type": 2 00:14:31.258 } 00:14:31.258 ], 00:14:31.258 "driver_specific": { 00:14:31.258 "raid": { 00:14:31.258 "uuid": "ff7ac9df-0daf-4959-a0ea-fb2dcc295086", 00:14:31.258 "strip_size_kb": 0, 00:14:31.258 "state": "online", 00:14:31.258 "raid_level": "raid1", 00:14:31.258 "superblock": true, 00:14:31.258 "num_base_bdevs": 3, 00:14:31.258 "num_base_bdevs_discovered": 3, 00:14:31.258 "num_base_bdevs_operational": 3, 00:14:31.258 "base_bdevs_list": [ 00:14:31.258 { 00:14:31.258 "name": "NewBaseBdev", 00:14:31.258 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:31.258 "is_configured": true, 00:14:31.258 "data_offset": 2048, 00:14:31.258 "data_size": 63488 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "name": "BaseBdev2", 00:14:31.258 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:31.258 "is_configured": true, 00:14:31.258 "data_offset": 2048, 00:14:31.258 "data_size": 63488 00:14:31.258 }, 00:14:31.258 { 00:14:31.258 "name": "BaseBdev3", 00:14:31.258 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:31.258 "is_configured": true, 00:14:31.258 "data_offset": 2048, 00:14:31.258 "data_size": 63488 00:14:31.258 } 00:14:31.258 ] 00:14:31.258 } 00:14:31.258 
} 00:14:31.258 }' 00:14:31.258 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:31.520 BaseBdev2 00:14:31.520 BaseBdev3' 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:31.520 "name": "NewBaseBdev", 00:14:31.520 "aliases": [ 00:14:31.520 "3047fb05-8b45-4404-baf7-445c33c515da" 00:14:31.520 ], 00:14:31.520 "product_name": "Malloc disk", 00:14:31.520 "block_size": 512, 00:14:31.520 "num_blocks": 65536, 00:14:31.520 "uuid": "3047fb05-8b45-4404-baf7-445c33c515da", 00:14:31.520 "assigned_rate_limits": { 00:14:31.520 "rw_ios_per_sec": 0, 00:14:31.520 "rw_mbytes_per_sec": 0, 00:14:31.520 "r_mbytes_per_sec": 0, 00:14:31.520 "w_mbytes_per_sec": 0 00:14:31.520 }, 00:14:31.520 "claimed": true, 00:14:31.520 "claim_type": "exclusive_write", 00:14:31.520 "zoned": false, 00:14:31.520 "supported_io_types": { 00:14:31.520 "read": true, 00:14:31.520 "write": true, 00:14:31.520 "unmap": true, 00:14:31.520 "write_zeroes": true, 00:14:31.520 "flush": true, 00:14:31.520 "reset": true, 00:14:31.520 "compare": false, 00:14:31.520 "compare_and_write": false, 00:14:31.520 "abort": true, 00:14:31.520 "nvme_admin": false, 00:14:31.520 "nvme_io": false 00:14:31.520 }, 00:14:31.520 "memory_domains": [ 00:14:31.520 { 00:14:31.520 "dma_device_id": "system", 00:14:31.520 
"dma_device_type": 1 00:14:31.520 }, 00:14:31.520 { 00:14:31.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.520 "dma_device_type": 2 00:14:31.520 } 00:14:31.520 ], 00:14:31.520 "driver_specific": {} 00:14:31.520 }' 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:31.520 13:42:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:31.781 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.041 "name": "BaseBdev2", 00:14:32.041 "aliases": [ 00:14:32.041 "8002c26e-28f2-4142-9529-858a0d72de54" 00:14:32.041 ], 00:14:32.041 "product_name": "Malloc disk", 00:14:32.041 "block_size": 512, 00:14:32.041 "num_blocks": 65536, 00:14:32.041 "uuid": "8002c26e-28f2-4142-9529-858a0d72de54", 00:14:32.041 "assigned_rate_limits": { 00:14:32.041 "rw_ios_per_sec": 0, 00:14:32.041 "rw_mbytes_per_sec": 0, 00:14:32.041 "r_mbytes_per_sec": 0, 00:14:32.041 "w_mbytes_per_sec": 0 00:14:32.041 }, 00:14:32.041 "claimed": true, 00:14:32.041 "claim_type": "exclusive_write", 00:14:32.041 "zoned": false, 00:14:32.041 "supported_io_types": { 00:14:32.041 "read": true, 00:14:32.041 "write": true, 00:14:32.041 "unmap": true, 00:14:32.041 "write_zeroes": true, 00:14:32.041 "flush": true, 00:14:32.041 "reset": true, 00:14:32.041 "compare": false, 00:14:32.041 "compare_and_write": false, 00:14:32.041 "abort": true, 00:14:32.041 "nvme_admin": false, 00:14:32.041 "nvme_io": false 00:14:32.041 }, 00:14:32.041 "memory_domains": [ 00:14:32.041 { 00:14:32.041 "dma_device_id": "system", 00:14:32.041 "dma_device_type": 1 00:14:32.041 }, 00:14:32.041 { 00:14:32.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.041 "dma_device_type": 2 00:14:32.041 } 00:14:32.041 ], 00:14:32.041 "driver_specific": {} 00:14:32.041 }' 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.041 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.301 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.562 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.562 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.562 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.562 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:32.562 13:42:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.562 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.562 "name": "BaseBdev3", 00:14:32.562 "aliases": [ 00:14:32.562 "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6" 00:14:32.562 ], 00:14:32.562 "product_name": "Malloc disk", 00:14:32.562 "block_size": 512, 00:14:32.562 "num_blocks": 65536, 00:14:32.562 "uuid": "c30fdf0a-1114-47a6-b2b7-35023b5ac6b6", 00:14:32.562 "assigned_rate_limits": { 00:14:32.562 "rw_ios_per_sec": 0, 00:14:32.562 "rw_mbytes_per_sec": 0, 00:14:32.562 "r_mbytes_per_sec": 0, 00:14:32.562 "w_mbytes_per_sec": 0 00:14:32.562 }, 00:14:32.562 "claimed": true, 00:14:32.562 "claim_type": "exclusive_write", 00:14:32.562 "zoned": false, 00:14:32.562 "supported_io_types": { 00:14:32.562 "read": true, 00:14:32.562 "write": true, 00:14:32.562 "unmap": true, 00:14:32.562 "write_zeroes": true, 00:14:32.562 "flush": true, 00:14:32.562 "reset": true, 00:14:32.562 
"compare": false, 00:14:32.562 "compare_and_write": false, 00:14:32.562 "abort": true, 00:14:32.562 "nvme_admin": false, 00:14:32.562 "nvme_io": false 00:14:32.562 }, 00:14:32.562 "memory_domains": [ 00:14:32.562 { 00:14:32.562 "dma_device_id": "system", 00:14:32.562 "dma_device_type": 1 00:14:32.562 }, 00:14:32.562 { 00:14:32.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.562 "dma_device_type": 2 00:14:32.562 } 00:14:32.562 ], 00:14:32.562 "driver_specific": {} 00:14:32.562 }' 00:14:32.562 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.822 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.082 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.082 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.082 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:33.343 [2024-06-10 
13:42:47.559447] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:33.343 [2024-06-10 13:42:47.559462] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:33.343 [2024-06-10 13:42:47.559501] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:33.343 [2024-06-10 13:42:47.559722] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:33.343 [2024-06-10 13:42:47.559729] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97c290 name Existed_Raid, state offline 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1551131 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1551131 ']' 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1551131 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1551131 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1551131' 00:14:33.343 killing process with pid 1551131 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1551131 00:14:33.343 [2024-06-10 13:42:47.629503] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1551131 00:14:33.343 [2024-06-10 13:42:47.644571] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:33.343 00:14:33.343 real 0m24.764s 00:14:33.343 user 0m46.383s 00:14:33.343 sys 0m3.659s 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:33.343 13:42:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.343 ************************************ 00:14:33.343 END TEST raid_state_function_test_sb 00:14:33.343 ************************************ 00:14:33.343 13:42:47 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:14:33.343 13:42:47 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:14:33.343 13:42:47 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:33.343 13:42:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:33.604 ************************************ 00:14:33.604 START TEST raid_superblock_test 00:14:33.604 ************************************ 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 3 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1556673 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1556673 /var/tmp/spdk-raid.sock 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1556673 ']' 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:33.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:33.604 13:42:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.604 [2024-06-10 13:42:47.903670] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:14:33.604 [2024-06-10 13:42:47.903721] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1556673 ] 00:14:33.604 [2024-06-10 13:42:47.990013] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:33.604 [2024-06-10 13:42:48.054667] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:14:33.864 [2024-06-10 13:42:48.103460] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:33.864 [2024-06-10 13:42:48.103485] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:34.436 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:34.699 malloc1 00:14:34.699 13:42:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:34.699 [2024-06-10 13:42:49.150776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:34.699 [2024-06-10 13:42:49.150810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:34.699 [2024-06-10 13:42:49.150823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d0550 00:14:34.699 [2024-06-10 13:42:49.150830] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:34.699 [2024-06-10 13:42:49.152172] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:34.699 [2024-06-10 13:42:49.152194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:34.699 pt1 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:34.699 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:34.960 malloc2 00:14:34.960 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:35.220 [2024-06-10 13:42:49.553936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:35.220 [2024-06-10 13:42:49.553965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.220 [2024-06-10 13:42:49.553976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27920f0 00:14:35.220 [2024-06-10 13:42:49.553982] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.220 [2024-06-10 13:42:49.555238] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.220 [2024-06-10 13:42:49.555258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:35.220 pt2 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:35.220 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:35.479 malloc3 00:14:35.479 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:35.739 [2024-06-10 13:42:49.956974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:35.739 [2024-06-10 13:42:49.957002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.739 [2024-06-10 13:42:49.957012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27935b0 00:14:35.739 [2024-06-10 13:42:49.957018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.739 [2024-06-10 13:42:49.958274] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.739 [2024-06-10 13:42:49.958293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:35.739 pt3 00:14:35.739 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:35.739 13:42:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:35.739 13:42:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:35.739 [2024-06-10 13:42:50.157503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:35.739 [2024-06-10 13:42:50.158644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:35.739 [2024-06-10 13:42:50.158691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:35.739 [2024-06-10 13:42:50.158829] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c8f30 00:14:35.739 [2024-06-10 13:42:50.158837] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:35.739 [2024-06-10 13:42:50.159010] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26d2660 00:14:35.739 [2024-06-10 13:42:50.159132] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c8f30 00:14:35.739 [2024-06-10 13:42:50.159138] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26c8f30 00:14:35.739 [2024-06-10 13:42:50.159225] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.739 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:36.000 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.000 "name": "raid_bdev1", 00:14:36.000 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:36.000 "strip_size_kb": 0, 00:14:36.000 "state": "online", 00:14:36.000 "raid_level": "raid1", 00:14:36.000 "superblock": true, 00:14:36.000 "num_base_bdevs": 3, 00:14:36.000 "num_base_bdevs_discovered": 3, 00:14:36.000 "num_base_bdevs_operational": 3, 00:14:36.000 "base_bdevs_list": [ 00:14:36.000 { 00:14:36.000 "name": "pt1", 00:14:36.001 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:36.001 "is_configured": true, 00:14:36.001 "data_offset": 2048, 00:14:36.001 "data_size": 63488 00:14:36.001 }, 00:14:36.001 { 00:14:36.001 "name": "pt2", 00:14:36.001 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:36.001 "is_configured": true, 00:14:36.001 "data_offset": 2048, 00:14:36.001 "data_size": 63488 00:14:36.001 }, 00:14:36.001 { 00:14:36.001 "name": "pt3", 00:14:36.001 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:36.001 "is_configured": true, 00:14:36.001 "data_offset": 2048, 00:14:36.001 "data_size": 63488 00:14:36.001 } 00:14:36.001 ] 00:14:36.001 }' 00:14:36.001 13:42:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.001 13:42:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:36.571 13:42:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:36.831 [2024-06-10 13:42:51.104101] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:36.831 "name": "raid_bdev1", 00:14:36.831 "aliases": [ 00:14:36.831 "ecc8b44c-472f-42a0-8d55-9a263723ed98" 00:14:36.831 ], 00:14:36.831 "product_name": "Raid Volume", 00:14:36.831 "block_size": 512, 00:14:36.831 "num_blocks": 63488, 00:14:36.831 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:36.831 "assigned_rate_limits": { 00:14:36.831 "rw_ios_per_sec": 0, 00:14:36.831 "rw_mbytes_per_sec": 0, 00:14:36.831 "r_mbytes_per_sec": 0, 00:14:36.831 "w_mbytes_per_sec": 0 00:14:36.831 }, 00:14:36.831 "claimed": false, 00:14:36.831 "zoned": false, 00:14:36.831 "supported_io_types": { 00:14:36.831 "read": true, 00:14:36.831 "write": true, 00:14:36.831 
"unmap": false, 00:14:36.831 "write_zeroes": true, 00:14:36.831 "flush": false, 00:14:36.831 "reset": true, 00:14:36.831 "compare": false, 00:14:36.831 "compare_and_write": false, 00:14:36.831 "abort": false, 00:14:36.831 "nvme_admin": false, 00:14:36.831 "nvme_io": false 00:14:36.831 }, 00:14:36.831 "memory_domains": [ 00:14:36.831 { 00:14:36.831 "dma_device_id": "system", 00:14:36.831 "dma_device_type": 1 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.831 "dma_device_type": 2 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "dma_device_id": "system", 00:14:36.831 "dma_device_type": 1 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.831 "dma_device_type": 2 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "dma_device_id": "system", 00:14:36.831 "dma_device_type": 1 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.831 "dma_device_type": 2 00:14:36.831 } 00:14:36.831 ], 00:14:36.831 "driver_specific": { 00:14:36.831 "raid": { 00:14:36.831 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:36.831 "strip_size_kb": 0, 00:14:36.831 "state": "online", 00:14:36.831 "raid_level": "raid1", 00:14:36.831 "superblock": true, 00:14:36.831 "num_base_bdevs": 3, 00:14:36.831 "num_base_bdevs_discovered": 3, 00:14:36.831 "num_base_bdevs_operational": 3, 00:14:36.831 "base_bdevs_list": [ 00:14:36.831 { 00:14:36.831 "name": "pt1", 00:14:36.831 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:36.831 "is_configured": true, 00:14:36.831 "data_offset": 2048, 00:14:36.831 "data_size": 63488 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "name": "pt2", 00:14:36.831 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:36.831 "is_configured": true, 00:14:36.831 "data_offset": 2048, 00:14:36.831 "data_size": 63488 00:14:36.831 }, 00:14:36.831 { 00:14:36.831 "name": "pt3", 00:14:36.831 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:36.831 
"is_configured": true, 00:14:36.831 "data_offset": 2048, 00:14:36.831 "data_size": 63488 00:14:36.831 } 00:14:36.831 ] 00:14:36.831 } 00:14:36.831 } 00:14:36.831 }' 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:36.831 pt2 00:14:36.831 pt3' 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:36.831 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:37.091 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:37.091 "name": "pt1", 00:14:37.091 "aliases": [ 00:14:37.091 "00000000-0000-0000-0000-000000000001" 00:14:37.091 ], 00:14:37.091 "product_name": "passthru", 00:14:37.091 "block_size": 512, 00:14:37.091 "num_blocks": 65536, 00:14:37.091 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:37.091 "assigned_rate_limits": { 00:14:37.091 "rw_ios_per_sec": 0, 00:14:37.091 "rw_mbytes_per_sec": 0, 00:14:37.091 "r_mbytes_per_sec": 0, 00:14:37.091 "w_mbytes_per_sec": 0 00:14:37.091 }, 00:14:37.091 "claimed": true, 00:14:37.091 "claim_type": "exclusive_write", 00:14:37.091 "zoned": false, 00:14:37.091 "supported_io_types": { 00:14:37.092 "read": true, 00:14:37.092 "write": true, 00:14:37.092 "unmap": true, 00:14:37.092 "write_zeroes": true, 00:14:37.092 "flush": true, 00:14:37.092 "reset": true, 00:14:37.092 "compare": false, 00:14:37.092 "compare_and_write": false, 00:14:37.092 "abort": true, 00:14:37.092 "nvme_admin": false, 00:14:37.092 "nvme_io": false 00:14:37.092 }, 00:14:37.092 "memory_domains": [ 
00:14:37.092 { 00:14:37.092 "dma_device_id": "system", 00:14:37.092 "dma_device_type": 1 00:14:37.092 }, 00:14:37.092 { 00:14:37.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.092 "dma_device_type": 2 00:14:37.092 } 00:14:37.092 ], 00:14:37.092 "driver_specific": { 00:14:37.092 "passthru": { 00:14:37.092 "name": "pt1", 00:14:37.092 "base_bdev_name": "malloc1" 00:14:37.092 } 00:14:37.092 } 00:14:37.092 }' 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:37.092 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:37.351 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq 
'.[]' 00:14:37.611 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:37.611 "name": "pt2", 00:14:37.611 "aliases": [ 00:14:37.611 "00000000-0000-0000-0000-000000000002" 00:14:37.611 ], 00:14:37.611 "product_name": "passthru", 00:14:37.611 "block_size": 512, 00:14:37.611 "num_blocks": 65536, 00:14:37.611 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:37.611 "assigned_rate_limits": { 00:14:37.611 "rw_ios_per_sec": 0, 00:14:37.611 "rw_mbytes_per_sec": 0, 00:14:37.611 "r_mbytes_per_sec": 0, 00:14:37.611 "w_mbytes_per_sec": 0 00:14:37.611 }, 00:14:37.611 "claimed": true, 00:14:37.611 "claim_type": "exclusive_write", 00:14:37.611 "zoned": false, 00:14:37.611 "supported_io_types": { 00:14:37.611 "read": true, 00:14:37.611 "write": true, 00:14:37.611 "unmap": true, 00:14:37.611 "write_zeroes": true, 00:14:37.611 "flush": true, 00:14:37.611 "reset": true, 00:14:37.611 "compare": false, 00:14:37.611 "compare_and_write": false, 00:14:37.611 "abort": true, 00:14:37.611 "nvme_admin": false, 00:14:37.611 "nvme_io": false 00:14:37.611 }, 00:14:37.611 "memory_domains": [ 00:14:37.611 { 00:14:37.611 "dma_device_id": "system", 00:14:37.611 "dma_device_type": 1 00:14:37.611 }, 00:14:37.611 { 00:14:37.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.611 "dma_device_type": 2 00:14:37.611 } 00:14:37.611 ], 00:14:37.611 "driver_specific": { 00:14:37.611 "passthru": { 00:14:37.611 "name": "pt2", 00:14:37.611 "base_bdev_name": "malloc2" 00:14:37.611 } 00:14:37.611 } 00:14:37.611 }' 00:14:37.611 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.611 13:42:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.611 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:37.612 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.612 13:42:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:37.871 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.131 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.131 "name": "pt3", 00:14:38.131 "aliases": [ 00:14:38.131 "00000000-0000-0000-0000-000000000003" 00:14:38.131 ], 00:14:38.131 "product_name": "passthru", 00:14:38.131 "block_size": 512, 00:14:38.131 "num_blocks": 65536, 00:14:38.131 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:38.131 "assigned_rate_limits": { 00:14:38.131 "rw_ios_per_sec": 0, 00:14:38.131 "rw_mbytes_per_sec": 0, 00:14:38.131 "r_mbytes_per_sec": 0, 00:14:38.131 "w_mbytes_per_sec": 0 00:14:38.131 }, 00:14:38.131 "claimed": true, 00:14:38.131 "claim_type": "exclusive_write", 00:14:38.131 "zoned": false, 00:14:38.131 "supported_io_types": { 00:14:38.131 "read": true, 00:14:38.131 "write": true, 00:14:38.131 "unmap": true, 00:14:38.131 "write_zeroes": true, 00:14:38.131 
"flush": true, 00:14:38.131 "reset": true, 00:14:38.131 "compare": false, 00:14:38.131 "compare_and_write": false, 00:14:38.131 "abort": true, 00:14:38.131 "nvme_admin": false, 00:14:38.131 "nvme_io": false 00:14:38.131 }, 00:14:38.131 "memory_domains": [ 00:14:38.131 { 00:14:38.131 "dma_device_id": "system", 00:14:38.131 "dma_device_type": 1 00:14:38.131 }, 00:14:38.131 { 00:14:38.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.131 "dma_device_type": 2 00:14:38.131 } 00:14:38.131 ], 00:14:38.131 "driver_specific": { 00:14:38.131 "passthru": { 00:14:38.131 "name": "pt3", 00:14:38.131 "base_bdev_name": "malloc3" 00:14:38.131 } 00:14:38.131 } 00:14:38.131 }' 00:14:38.131 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.131 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.131 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.131 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:38.391 13:42:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:38.651 [2024-06-10 13:42:53.020966] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:38.651 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ecc8b44c-472f-42a0-8d55-9a263723ed98 00:14:38.651 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ecc8b44c-472f-42a0-8d55-9a263723ed98 ']' 00:14:38.651 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:38.911 [2024-06-10 13:42:53.177157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:38.911 [2024-06-10 13:42:53.177170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:38.911 [2024-06-10 13:42:53.177207] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:38.911 [2024-06-10 13:42:53.177262] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:38.911 [2024-06-10 13:42:53.177269] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c8f30 name raid_bdev1, state offline 00:14:38.911 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.911 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:39.170 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:39.170 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:39.170 13:42:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.170 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:39.170 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.170 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:39.430 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:39.430 13:42:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:39.690 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:39.690 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:39.950 13:42:54 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:39.950 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:39.950 [2024-06-10 13:42:54.412263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:39.950 [2024-06-10 13:42:54.413454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:39.950 [2024-06-10 13:42:54.413490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:39.950 [2024-06-10 13:42:54.413527] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:39.950 [2024-06-10 13:42:54.413556] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:39.950 [2024-06-10 13:42:54.413571] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:39.950 [2024-06-10 13:42:54.413581] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:39.950 [2024-06-10 13:42:54.413587] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c8330 name raid_bdev1, state configuring 00:14:39.950 request: 00:14:39.950 { 00:14:39.950 "name": "raid_bdev1", 00:14:39.950 "raid_level": "raid1", 00:14:39.950 "base_bdevs": [ 00:14:39.950 "malloc1", 00:14:39.950 "malloc2", 00:14:39.950 "malloc3" 00:14:39.950 ], 00:14:39.950 "superblock": false, 00:14:39.950 "method": "bdev_raid_create", 00:14:39.950 "req_id": 1 00:14:39.950 } 00:14:39.950 Got JSON-RPC error response 00:14:39.950 response: 00:14:39.950 { 00:14:39.950 "code": -17, 00:14:39.950 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:39.950 } 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 
00:14:40.210 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:40.470 [2024-06-10 13:42:54.817240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:40.470 [2024-06-10 13:42:54.817271] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:40.470 [2024-06-10 13:42:54.817283] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d37c0 00:14:40.470 [2024-06-10 13:42:54.817289] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:40.470 [2024-06-10 13:42:54.818689] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:40.470 [2024-06-10 13:42:54.818710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:40.470 [2024-06-10 13:42:54.818758] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:40.470 [2024-06-10 13:42:54.818776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:40.470 pt1 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.470 13:42:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:40.729 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.729 "name": "raid_bdev1", 00:14:40.729 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:40.729 "strip_size_kb": 0, 00:14:40.729 "state": "configuring", 00:14:40.729 "raid_level": "raid1", 00:14:40.729 "superblock": true, 00:14:40.729 "num_base_bdevs": 3, 00:14:40.729 "num_base_bdevs_discovered": 1, 00:14:40.729 "num_base_bdevs_operational": 3, 00:14:40.729 "base_bdevs_list": [ 00:14:40.729 { 00:14:40.729 "name": "pt1", 00:14:40.729 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:40.729 "is_configured": true, 00:14:40.729 "data_offset": 2048, 00:14:40.729 "data_size": 63488 00:14:40.729 }, 00:14:40.729 { 00:14:40.729 "name": null, 00:14:40.729 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:40.729 "is_configured": false, 00:14:40.729 "data_offset": 2048, 00:14:40.729 "data_size": 63488 00:14:40.729 }, 00:14:40.729 { 00:14:40.729 "name": null, 00:14:40.729 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:40.730 "is_configured": false, 00:14:40.730 "data_offset": 2048, 00:14:40.730 "data_size": 63488 00:14:40.730 } 00:14:40.730 ] 00:14:40.730 }' 00:14:40.730 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.730 13:42:55 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.299 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:41.299 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:41.299 [2024-06-10 13:42:55.763640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:41.299 [2024-06-10 13:42:55.763679] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:41.299 [2024-06-10 13:42:55.763692] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2794110 00:14:41.299 [2024-06-10 13:42:55.763699] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:41.299 [2024-06-10 13:42:55.763991] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:41.299 [2024-06-10 13:42:55.764004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:41.299 [2024-06-10 13:42:55.764051] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:41.299 [2024-06-10 13:42:55.764065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:41.299 pt2 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:41.558 [2024-06-10 13:42:55.964181] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:41.558 13:42:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.558 13:42:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:41.818 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.818 "name": "raid_bdev1", 00:14:41.818 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:41.818 "strip_size_kb": 0, 00:14:41.818 "state": "configuring", 00:14:41.818 "raid_level": "raid1", 00:14:41.818 "superblock": true, 00:14:41.818 "num_base_bdevs": 3, 00:14:41.818 "num_base_bdevs_discovered": 1, 00:14:41.818 "num_base_bdevs_operational": 3, 00:14:41.818 "base_bdevs_list": [ 00:14:41.818 { 00:14:41.818 "name": "pt1", 00:14:41.818 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:41.818 "is_configured": true, 00:14:41.818 "data_offset": 2048, 00:14:41.818 "data_size": 63488 00:14:41.818 }, 00:14:41.818 { 00:14:41.818 "name": null, 00:14:41.818 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:41.818 
"is_configured": false, 00:14:41.818 "data_offset": 2048, 00:14:41.818 "data_size": 63488 00:14:41.818 }, 00:14:41.818 { 00:14:41.818 "name": null, 00:14:41.818 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:41.818 "is_configured": false, 00:14:41.818 "data_offset": 2048, 00:14:41.818 "data_size": 63488 00:14:41.818 } 00:14:41.818 ] 00:14:41.818 }' 00:14:41.818 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.818 13:42:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.388 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:42.388 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:42.388 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:42.648 [2024-06-10 13:42:56.918586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:42.648 [2024-06-10 13:42:56.918626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.648 [2024-06-10 13:42:56.918639] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d27c0 00:14:42.648 [2024-06-10 13:42:56.918646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.648 [2024-06-10 13:42:56.918946] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.648 [2024-06-10 13:42:56.918959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:42.648 [2024-06-10 13:42:56.919006] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:42.648 [2024-06-10 13:42:56.919019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:42.648 pt2 
00:14:42.648 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:42.648 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:42.648 13:42:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:42.648 [2024-06-10 13:42:57.119090] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:42.648 [2024-06-10 13:42:57.119116] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.648 [2024-06-10 13:42:57.119127] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26c9a50 00:14:42.648 [2024-06-10 13:42:57.119134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.648 [2024-06-10 13:42:57.119399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.648 [2024-06-10 13:42:57.119411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:42.648 [2024-06-10 13:42:57.119450] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:42.648 [2024-06-10 13:42:57.119462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:42.648 [2024-06-10 13:42:57.119550] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c93f0 00:14:42.648 [2024-06-10 13:42:57.119557] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:42.648 [2024-06-10 13:42:57.119703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2882470 00:14:42.648 [2024-06-10 13:42:57.119811] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c93f0 00:14:42.648 [2024-06-10 13:42:57.119816] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x26c93f0 00:14:42.648 [2024-06-10 13:42:57.119893] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:42.909 pt3 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.909 "name": "raid_bdev1", 00:14:42.909 "uuid": 
"ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:42.909 "strip_size_kb": 0, 00:14:42.909 "state": "online", 00:14:42.909 "raid_level": "raid1", 00:14:42.909 "superblock": true, 00:14:42.909 "num_base_bdevs": 3, 00:14:42.909 "num_base_bdevs_discovered": 3, 00:14:42.909 "num_base_bdevs_operational": 3, 00:14:42.909 "base_bdevs_list": [ 00:14:42.909 { 00:14:42.909 "name": "pt1", 00:14:42.909 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:42.909 "is_configured": true, 00:14:42.909 "data_offset": 2048, 00:14:42.909 "data_size": 63488 00:14:42.909 }, 00:14:42.909 { 00:14:42.909 "name": "pt2", 00:14:42.909 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:42.909 "is_configured": true, 00:14:42.909 "data_offset": 2048, 00:14:42.909 "data_size": 63488 00:14:42.909 }, 00:14:42.909 { 00:14:42.909 "name": "pt3", 00:14:42.909 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:42.909 "is_configured": true, 00:14:42.909 "data_offset": 2048, 00:14:42.909 "data_size": 63488 00:14:42.909 } 00:14:42.909 ] 00:14:42.909 }' 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.909 13:42:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:43.480 13:42:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:43.740 [2024-06-10 13:42:58.041648] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:43.740 "name": "raid_bdev1", 00:14:43.740 "aliases": [ 00:14:43.740 "ecc8b44c-472f-42a0-8d55-9a263723ed98" 00:14:43.740 ], 00:14:43.740 "product_name": "Raid Volume", 00:14:43.740 "block_size": 512, 00:14:43.740 "num_blocks": 63488, 00:14:43.740 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:43.740 "assigned_rate_limits": { 00:14:43.740 "rw_ios_per_sec": 0, 00:14:43.740 "rw_mbytes_per_sec": 0, 00:14:43.740 "r_mbytes_per_sec": 0, 00:14:43.740 "w_mbytes_per_sec": 0 00:14:43.740 }, 00:14:43.740 "claimed": false, 00:14:43.740 "zoned": false, 00:14:43.740 "supported_io_types": { 00:14:43.740 "read": true, 00:14:43.740 "write": true, 00:14:43.740 "unmap": false, 00:14:43.740 "write_zeroes": true, 00:14:43.740 "flush": false, 00:14:43.740 "reset": true, 00:14:43.740 "compare": false, 00:14:43.740 "compare_and_write": false, 00:14:43.740 "abort": false, 00:14:43.740 "nvme_admin": false, 00:14:43.740 "nvme_io": false 00:14:43.740 }, 00:14:43.740 "memory_domains": [ 00:14:43.740 { 00:14:43.740 "dma_device_id": "system", 00:14:43.740 "dma_device_type": 1 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.740 "dma_device_type": 2 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "dma_device_id": "system", 00:14:43.740 "dma_device_type": 1 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.740 "dma_device_type": 2 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "dma_device_id": "system", 00:14:43.740 "dma_device_type": 1 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.740 "dma_device_type": 2 00:14:43.740 } 00:14:43.740 ], 00:14:43.740 "driver_specific": { 00:14:43.740 "raid": { 00:14:43.740 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:43.740 "strip_size_kb": 0, 00:14:43.740 "state": "online", 00:14:43.740 "raid_level": "raid1", 00:14:43.740 "superblock": true, 00:14:43.740 "num_base_bdevs": 3, 00:14:43.740 "num_base_bdevs_discovered": 3, 00:14:43.740 "num_base_bdevs_operational": 3, 00:14:43.740 "base_bdevs_list": [ 00:14:43.740 { 00:14:43.740 "name": "pt1", 00:14:43.740 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:43.740 "is_configured": true, 00:14:43.740 "data_offset": 2048, 00:14:43.740 "data_size": 63488 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "name": "pt2", 00:14:43.740 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:43.740 "is_configured": true, 00:14:43.740 "data_offset": 2048, 00:14:43.740 "data_size": 63488 00:14:43.740 }, 00:14:43.740 { 00:14:43.740 "name": "pt3", 00:14:43.740 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:43.740 "is_configured": true, 00:14:43.740 "data_offset": 2048, 00:14:43.740 "data_size": 63488 00:14:43.740 } 00:14:43.740 ] 00:14:43.740 } 00:14:43.740 } 00:14:43.740 }' 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:43.740 pt2 00:14:43.740 pt3' 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:43.740 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.000 13:42:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.000 "name": "pt1", 00:14:44.000 "aliases": [ 00:14:44.000 "00000000-0000-0000-0000-000000000001" 00:14:44.000 ], 00:14:44.000 "product_name": "passthru", 00:14:44.001 "block_size": 512, 00:14:44.001 "num_blocks": 65536, 00:14:44.001 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:44.001 "assigned_rate_limits": { 00:14:44.001 "rw_ios_per_sec": 0, 00:14:44.001 "rw_mbytes_per_sec": 0, 00:14:44.001 "r_mbytes_per_sec": 0, 00:14:44.001 "w_mbytes_per_sec": 0 00:14:44.001 }, 00:14:44.001 "claimed": true, 00:14:44.001 "claim_type": "exclusive_write", 00:14:44.001 "zoned": false, 00:14:44.001 "supported_io_types": { 00:14:44.001 "read": true, 00:14:44.001 "write": true, 00:14:44.001 "unmap": true, 00:14:44.001 "write_zeroes": true, 00:14:44.001 "flush": true, 00:14:44.001 "reset": true, 00:14:44.001 "compare": false, 00:14:44.001 "compare_and_write": false, 00:14:44.001 "abort": true, 00:14:44.001 "nvme_admin": false, 00:14:44.001 "nvme_io": false 00:14:44.001 }, 00:14:44.001 "memory_domains": [ 00:14:44.001 { 00:14:44.001 "dma_device_id": "system", 00:14:44.001 "dma_device_type": 1 00:14:44.001 }, 00:14:44.001 { 00:14:44.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.001 "dma_device_type": 2 00:14:44.001 } 00:14:44.001 ], 00:14:44.001 "driver_specific": { 00:14:44.001 "passthru": { 00:14:44.001 "name": "pt1", 00:14:44.001 "base_bdev_name": "malloc1" 00:14:44.001 } 00:14:44.001 } 00:14:44.001 }' 00:14:44.001 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.001 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.001 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.001 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.001 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:44.261 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.521 "name": "pt2", 00:14:44.521 "aliases": [ 00:14:44.521 "00000000-0000-0000-0000-000000000002" 00:14:44.521 ], 00:14:44.521 "product_name": "passthru", 00:14:44.521 "block_size": 512, 00:14:44.521 "num_blocks": 65536, 00:14:44.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:44.521 "assigned_rate_limits": { 00:14:44.521 "rw_ios_per_sec": 0, 00:14:44.521 "rw_mbytes_per_sec": 0, 00:14:44.521 "r_mbytes_per_sec": 0, 00:14:44.521 "w_mbytes_per_sec": 0 00:14:44.521 }, 00:14:44.521 "claimed": true, 00:14:44.521 "claim_type": "exclusive_write", 00:14:44.521 "zoned": false, 00:14:44.521 "supported_io_types": { 00:14:44.521 "read": true, 00:14:44.521 "write": true, 00:14:44.521 "unmap": true, 00:14:44.521 "write_zeroes": true, 00:14:44.521 "flush": true, 00:14:44.521 "reset": 
true, 00:14:44.521 "compare": false, 00:14:44.521 "compare_and_write": false, 00:14:44.521 "abort": true, 00:14:44.521 "nvme_admin": false, 00:14:44.521 "nvme_io": false 00:14:44.521 }, 00:14:44.521 "memory_domains": [ 00:14:44.521 { 00:14:44.521 "dma_device_id": "system", 00:14:44.521 "dma_device_type": 1 00:14:44.521 }, 00:14:44.521 { 00:14:44.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.521 "dma_device_type": 2 00:14:44.521 } 00:14:44.521 ], 00:14:44.521 "driver_specific": { 00:14:44.521 "passthru": { 00:14:44.521 "name": "pt2", 00:14:44.521 "base_bdev_name": "malloc2" 00:14:44.521 } 00:14:44.521 } 00:14:44.521 }' 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.521 13:42:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:44.781 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.042 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.042 "name": "pt3", 00:14:45.042 "aliases": [ 00:14:45.042 "00000000-0000-0000-0000-000000000003" 00:14:45.042 ], 00:14:45.042 "product_name": "passthru", 00:14:45.042 "block_size": 512, 00:14:45.042 "num_blocks": 65536, 00:14:45.042 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:45.042 "assigned_rate_limits": { 00:14:45.042 "rw_ios_per_sec": 0, 00:14:45.042 "rw_mbytes_per_sec": 0, 00:14:45.042 "r_mbytes_per_sec": 0, 00:14:45.042 "w_mbytes_per_sec": 0 00:14:45.042 }, 00:14:45.042 "claimed": true, 00:14:45.042 "claim_type": "exclusive_write", 00:14:45.042 "zoned": false, 00:14:45.042 "supported_io_types": { 00:14:45.042 "read": true, 00:14:45.042 "write": true, 00:14:45.042 "unmap": true, 00:14:45.042 "write_zeroes": true, 00:14:45.042 "flush": true, 00:14:45.042 "reset": true, 00:14:45.042 "compare": false, 00:14:45.042 "compare_and_write": false, 00:14:45.042 "abort": true, 00:14:45.042 "nvme_admin": false, 00:14:45.042 "nvme_io": false 00:14:45.042 }, 00:14:45.042 "memory_domains": [ 00:14:45.042 { 00:14:45.042 "dma_device_id": "system", 00:14:45.042 "dma_device_type": 1 00:14:45.042 }, 00:14:45.042 { 00:14:45.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.042 "dma_device_type": 2 00:14:45.042 } 00:14:45.042 ], 00:14:45.042 "driver_specific": { 00:14:45.042 "passthru": { 00:14:45.042 "name": "pt3", 00:14:45.042 "base_bdev_name": "malloc3" 00:14:45.042 } 00:14:45.042 } 00:14:45.042 }' 00:14:45.042 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.042 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.042 13:42:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.042 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:45.302 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:45.561 [2024-06-10 13:42:59.954489] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:45.561 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ecc8b44c-472f-42a0-8d55-9a263723ed98 '!=' ecc8b44c-472f-42a0-8d55-9a263723ed98 ']' 00:14:45.561 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:45.561 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:45.561 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:45.561 13:42:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:45.821 [2024-06-10 13:43:00.155052] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.821 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:46.081 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.081 "name": "raid_bdev1", 00:14:46.081 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:46.081 "strip_size_kb": 0, 00:14:46.081 "state": "online", 00:14:46.081 "raid_level": "raid1", 00:14:46.081 "superblock": true, 00:14:46.081 "num_base_bdevs": 3, 00:14:46.081 "num_base_bdevs_discovered": 2, 
00:14:46.081 "num_base_bdevs_operational": 2, 00:14:46.081 "base_bdevs_list": [ 00:14:46.081 { 00:14:46.081 "name": null, 00:14:46.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.081 "is_configured": false, 00:14:46.081 "data_offset": 2048, 00:14:46.081 "data_size": 63488 00:14:46.081 }, 00:14:46.081 { 00:14:46.081 "name": "pt2", 00:14:46.081 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:46.081 "is_configured": true, 00:14:46.081 "data_offset": 2048, 00:14:46.081 "data_size": 63488 00:14:46.081 }, 00:14:46.081 { 00:14:46.081 "name": "pt3", 00:14:46.081 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:46.081 "is_configured": true, 00:14:46.081 "data_offset": 2048, 00:14:46.081 "data_size": 63488 00:14:46.081 } 00:14:46.081 ] 00:14:46.081 }' 00:14:46.081 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.081 13:43:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.651 13:43:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:46.651 [2024-06-10 13:43:01.077360] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:46.651 [2024-06-10 13:43:01.077381] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:46.651 [2024-06-10 13:43:01.077423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:46.651 [2024-06-10 13:43:01.077468] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:46.651 [2024-06-10 13:43:01.077474] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c93f0 name raid_bdev1, state offline 00:14:46.651 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.651 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:46.911 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:46.911 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:46.911 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:46.911 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:46.911 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:47.172 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:47.172 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:47.172 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:47.432 [2024-06-10 13:43:01.879355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:47.432 [2024-06-10 13:43:01.879383] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:14:47.432 [2024-06-10 13:43:01.879396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d0f10 00:14:47.432 [2024-06-10 13:43:01.879403] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:47.432 [2024-06-10 13:43:01.880749] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:47.432 [2024-06-10 13:43:01.880770] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:47.432 [2024-06-10 13:43:01.880816] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:47.432 [2024-06-10 13:43:01.880834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:47.432 pt2 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.432 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.433 13:43:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:47.693 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.693 "name": "raid_bdev1", 00:14:47.693 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:47.693 "strip_size_kb": 0, 00:14:47.693 "state": "configuring", 00:14:47.693 "raid_level": "raid1", 00:14:47.693 "superblock": true, 00:14:47.693 "num_base_bdevs": 3, 00:14:47.693 "num_base_bdevs_discovered": 1, 00:14:47.693 "num_base_bdevs_operational": 2, 00:14:47.693 "base_bdevs_list": [ 00:14:47.693 { 00:14:47.693 "name": null, 00:14:47.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.693 "is_configured": false, 00:14:47.693 "data_offset": 2048, 00:14:47.693 "data_size": 63488 00:14:47.693 }, 00:14:47.693 { 00:14:47.693 "name": "pt2", 00:14:47.693 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:47.693 "is_configured": true, 00:14:47.693 "data_offset": 2048, 00:14:47.693 "data_size": 63488 00:14:47.693 }, 00:14:47.693 { 00:14:47.693 "name": null, 00:14:47.693 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:47.693 "is_configured": false, 00:14:47.693 "data_offset": 2048, 00:14:47.693 "data_size": 63488 00:14:47.693 } 00:14:47.693 ] 00:14:47.693 }' 00:14:47.693 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.693 13:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.261 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:14:48.261 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:48.261 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:14:48.261 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:48.520 [2024-06-10 13:43:02.845800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:48.520 [2024-06-10 13:43:02.845833] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:48.520 [2024-06-10 13:43:02.845844] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26c8730 00:14:48.520 [2024-06-10 13:43:02.845850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:48.520 [2024-06-10 13:43:02.846128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:48.520 [2024-06-10 13:43:02.846141] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:48.520 [2024-06-10 13:43:02.846192] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:48.520 [2024-06-10 13:43:02.846204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:48.520 [2024-06-10 13:43:02.846281] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c7480 00:14:48.520 [2024-06-10 13:43:02.846288] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:48.520 [2024-06-10 13:43:02.846436] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26d0220 00:14:48.520 [2024-06-10 13:43:02.846542] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c7480 00:14:48.520 [2024-06-10 13:43:02.846548] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26c7480 00:14:48.520 [2024-06-10 13:43:02.846624] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:48.520 pt3 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.520 13:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:48.780 13:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.780 "name": "raid_bdev1", 00:14:48.780 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:48.780 "strip_size_kb": 0, 00:14:48.780 "state": "online", 00:14:48.780 "raid_level": "raid1", 00:14:48.780 "superblock": true, 00:14:48.780 "num_base_bdevs": 3, 00:14:48.780 "num_base_bdevs_discovered": 2, 00:14:48.780 "num_base_bdevs_operational": 2, 00:14:48.780 "base_bdevs_list": [ 00:14:48.780 { 00:14:48.780 "name": null, 00:14:48.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:48.780 "is_configured": false, 00:14:48.780 
"data_offset": 2048, 00:14:48.780 "data_size": 63488 00:14:48.780 }, 00:14:48.780 { 00:14:48.780 "name": "pt2", 00:14:48.780 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:48.780 "is_configured": true, 00:14:48.780 "data_offset": 2048, 00:14:48.780 "data_size": 63488 00:14:48.780 }, 00:14:48.780 { 00:14:48.780 "name": "pt3", 00:14:48.780 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:48.780 "is_configured": true, 00:14:48.780 "data_offset": 2048, 00:14:48.780 "data_size": 63488 00:14:48.780 } 00:14:48.780 ] 00:14:48.780 }' 00:14:48.780 13:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.780 13:43:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.349 13:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:49.349 [2024-06-10 13:43:03.820263] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:49.349 [2024-06-10 13:43:03.820277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:49.349 [2024-06-10 13:43:03.820312] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:49.349 [2024-06-10 13:43:03.820355] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:49.349 [2024-06-10 13:43:03.820361] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c7480 name raid_bdev1, state offline 00:14:49.609 13:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.609 13:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:14:49.609 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 
00:14:49.609 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:14:49.609 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:14:49.609 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:14:49.609 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:49.869 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:50.129 [2024-06-10 13:43:04.429783] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:50.129 [2024-06-10 13:43:04.429811] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.129 [2024-06-10 13:43:04.429821] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2792c10 00:14:50.129 [2024-06-10 13:43:04.429828] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.129 [2024-06-10 13:43:04.431188] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.129 [2024-06-10 13:43:04.431207] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:50.129 [2024-06-10 13:43:04.431254] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:50.129 [2024-06-10 13:43:04.431271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:50.129 [2024-06-10 13:43:04.431345] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:50.129 [2024-06-10 13:43:04.431352] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:14:50.129 [2024-06-10 13:43:04.431368] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c7700 name raid_bdev1, state configuring 00:14:50.129 [2024-06-10 13:43:04.431384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:50.129 pt1 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.129 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:50.389 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.389 "name": "raid_bdev1", 00:14:50.389 "uuid": 
"ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:50.389 "strip_size_kb": 0, 00:14:50.389 "state": "configuring", 00:14:50.389 "raid_level": "raid1", 00:14:50.389 "superblock": true, 00:14:50.389 "num_base_bdevs": 3, 00:14:50.389 "num_base_bdevs_discovered": 1, 00:14:50.389 "num_base_bdevs_operational": 2, 00:14:50.389 "base_bdevs_list": [ 00:14:50.389 { 00:14:50.389 "name": null, 00:14:50.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.389 "is_configured": false, 00:14:50.389 "data_offset": 2048, 00:14:50.389 "data_size": 63488 00:14:50.389 }, 00:14:50.389 { 00:14:50.389 "name": "pt2", 00:14:50.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:50.389 "is_configured": true, 00:14:50.389 "data_offset": 2048, 00:14:50.389 "data_size": 63488 00:14:50.389 }, 00:14:50.389 { 00:14:50.389 "name": null, 00:14:50.389 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:50.389 "is_configured": false, 00:14:50.389 "data_offset": 2048, 00:14:50.389 "data_size": 63488 00:14:50.389 } 00:14:50.389 ] 00:14:50.389 }' 00:14:50.389 13:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.389 13:43:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.959 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:50.959 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:14:50.959 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:14:50.959 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:51.219 [2024-06-10 13:43:05.568681] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:51.219 [2024-06-10 13:43:05.568707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:51.219 [2024-06-10 13:43:05.568719] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2793d60 00:14:51.219 [2024-06-10 13:43:05.568726] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:51.219 [2024-06-10 13:43:05.568993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:51.219 [2024-06-10 13:43:05.569004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:51.219 [2024-06-10 13:43:05.569046] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:51.219 [2024-06-10 13:43:05.569061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:51.219 [2024-06-10 13:43:05.569136] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c6ea0 00:14:51.219 [2024-06-10 13:43:05.569142] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:51.219 [2024-06-10 13:43:05.569292] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2882470 00:14:51.219 [2024-06-10 13:43:05.569397] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c6ea0 00:14:51.219 [2024-06-10 13:43:05.569402] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26c6ea0 00:14:51.219 [2024-06-10 13:43:05.569481] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:51.219 pt3 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.219 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:51.479 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.479 "name": "raid_bdev1", 00:14:51.479 "uuid": "ecc8b44c-472f-42a0-8d55-9a263723ed98", 00:14:51.479 "strip_size_kb": 0, 00:14:51.479 "state": "online", 00:14:51.479 "raid_level": "raid1", 00:14:51.479 "superblock": true, 00:14:51.479 "num_base_bdevs": 3, 00:14:51.479 "num_base_bdevs_discovered": 2, 00:14:51.479 "num_base_bdevs_operational": 2, 00:14:51.480 "base_bdevs_list": [ 00:14:51.480 { 00:14:51.480 "name": null, 00:14:51.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.480 "is_configured": false, 00:14:51.480 "data_offset": 2048, 00:14:51.480 "data_size": 63488 00:14:51.480 }, 00:14:51.480 { 00:14:51.480 "name": "pt2", 00:14:51.480 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.480 "is_configured": true, 00:14:51.480 
"data_offset": 2048, 00:14:51.480 "data_size": 63488 00:14:51.480 }, 00:14:51.480 { 00:14:51.480 "name": "pt3", 00:14:51.480 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:51.480 "is_configured": true, 00:14:51.480 "data_offset": 2048, 00:14:51.480 "data_size": 63488 00:14:51.480 } 00:14:51.480 ] 00:14:51.480 }' 00:14:51.480 13:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.480 13:43:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.050 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:52.050 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:14:52.310 [2024-06-10 13:43:06.723796] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' ecc8b44c-472f-42a0-8d55-9a263723ed98 '!=' ecc8b44c-472f-42a0-8d55-9a263723ed98 ']' 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1556673 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1556673 ']' 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1556673 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:14:52.310 13:43:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:52.310 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1556673 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1556673' 00:14:52.570 killing process with pid 1556673 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1556673 00:14:52.570 [2024-06-10 13:43:06.794384] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:52.570 [2024-06-10 13:43:06.794424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:52.570 [2024-06-10 13:43:06.794466] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:52.570 [2024-06-10 13:43:06.794472] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c6ea0 name raid_bdev1, state offline 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1556673 00:14:52.570 [2024-06-10 13:43:06.809766] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:52.570 00:14:52.570 real 0m19.085s 00:14:52.570 user 0m35.577s 00:14:52.570 sys 0m2.887s 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:52.570 13:43:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.570 ************************************ 00:14:52.570 END TEST raid_superblock_test 00:14:52.570 ************************************ 
00:14:52.570 13:43:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:14:52.570 13:43:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:52.570 13:43:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:52.570 13:43:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:52.570 ************************************ 00:14:52.570 START TEST raid_read_error_test 00:14:52.570 ************************************ 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 read 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.570 13:43:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.au0xP9NHMJ 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1560765 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1560765 /var/tmp/spdk-raid.sock 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1560765 ']' 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:52.570 
13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:52.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:52.570 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.831 [2024-06-10 13:43:07.092536] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:14:52.831 [2024-06-10 13:43:07.092589] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560765 ] 00:14:52.831 [2024-06-10 13:43:07.182942] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:52.831 [2024-06-10 13:43:07.261016] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:14:53.091 [2024-06-10 13:43:07.323726] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:53.091 [2024-06-10 13:43:07.323752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:53.661 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:53.661 13:43:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:53.661 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:53.661 13:43:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:53.921 BaseBdev1_malloc 
00:14:53.921 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:53.921 true 00:14:53.921 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:54.180 [2024-06-10 13:43:08.547959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:54.180 [2024-06-10 13:43:08.547990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.180 [2024-06-10 13:43:08.548002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2327c90 00:14:54.180 [2024-06-10 13:43:08.548014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.180 [2024-06-10 13:43:08.549475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.180 [2024-06-10 13:43:08.549496] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:54.180 BaseBdev1 00:14:54.180 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:54.180 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:54.440 BaseBdev2_malloc 00:14:54.440 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:54.700 true 00:14:54.700 13:43:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create 
-b EE_BaseBdev2_malloc -p BaseBdev2 00:14:54.700 [2024-06-10 13:43:09.151533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:54.700 [2024-06-10 13:43:09.151563] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.700 [2024-06-10 13:43:09.151577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232c400 00:14:54.700 [2024-06-10 13:43:09.151584] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.700 [2024-06-10 13:43:09.152853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.700 [2024-06-10 13:43:09.152873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:54.700 BaseBdev2 00:14:54.700 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:54.700 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:54.960 BaseBdev3_malloc 00:14:54.960 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:55.220 true 00:14:55.220 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:55.479 [2024-06-10 13:43:09.755123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:55.479 [2024-06-10 13:43:09.755151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.479 [2024-06-10 13:43:09.755169] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232efc0 
00:14:55.479 [2024-06-10 13:43:09.755176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.479 [2024-06-10 13:43:09.756435] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.479 [2024-06-10 13:43:09.756455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:55.479 BaseBdev3 00:14:55.479 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:55.739 [2024-06-10 13:43:09.955651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:55.739 [2024-06-10 13:43:09.956744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:55.739 [2024-06-10 13:43:09.956801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:55.739 [2024-06-10 13:43:09.956970] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232d060 00:14:55.739 [2024-06-10 13:43:09.956978] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:55.739 [2024-06-10 13:43:09.957130] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217dea0 00:14:55.739 [2024-06-10 13:43:09.957263] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232d060 00:14:55.739 [2024-06-10 13:43:09.957270] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x232d060 00:14:55.739 [2024-06-10 13:43:09.957350] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.739 13:43:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:55.739 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.739 "name": "raid_bdev1", 00:14:55.739 "uuid": "49a2e575-9e2e-4750-9189-57117e054591", 00:14:55.739 "strip_size_kb": 0, 00:14:55.739 "state": "online", 00:14:55.739 "raid_level": "raid1", 00:14:55.739 "superblock": true, 00:14:55.739 "num_base_bdevs": 3, 00:14:55.739 "num_base_bdevs_discovered": 3, 00:14:55.739 "num_base_bdevs_operational": 3, 00:14:55.739 "base_bdevs_list": [ 00:14:55.739 { 00:14:55.739 "name": "BaseBdev1", 00:14:55.739 "uuid": "e8fc3468-59a2-542d-89f1-25f32835bb50", 00:14:55.739 "is_configured": true, 00:14:55.739 "data_offset": 2048, 00:14:55.739 "data_size": 63488 00:14:55.739 }, 00:14:55.739 { 00:14:55.739 "name": "BaseBdev2", 00:14:55.739 "uuid": 
"f0d3e25d-84d7-5b44-8e22-43112f80df10", 00:14:55.739 "is_configured": true, 00:14:55.739 "data_offset": 2048, 00:14:55.739 "data_size": 63488 00:14:55.739 }, 00:14:55.739 { 00:14:55.739 "name": "BaseBdev3", 00:14:55.739 "uuid": "34a35df1-cd2a-573b-af81-bb24b29a9be9", 00:14:55.739 "is_configured": true, 00:14:55.739 "data_offset": 2048, 00:14:55.739 "data_size": 63488 00:14:55.739 } 00:14:55.739 ] 00:14:55.739 }' 00:14:55.739 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.739 13:43:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.320 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:56.320 13:43:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:56.630 [2024-06-10 13:43:10.802082] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217d9e0 00:14:57.245 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.505 13:43:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:57.765 13:43:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.765 "name": "raid_bdev1", 00:14:57.765 "uuid": "49a2e575-9e2e-4750-9189-57117e054591", 00:14:57.765 "strip_size_kb": 0, 00:14:57.765 "state": "online", 00:14:57.765 "raid_level": "raid1", 00:14:57.765 "superblock": true, 00:14:57.765 "num_base_bdevs": 3, 00:14:57.765 "num_base_bdevs_discovered": 3, 00:14:57.765 "num_base_bdevs_operational": 3, 00:14:57.765 "base_bdevs_list": [ 00:14:57.765 { 00:14:57.765 "name": "BaseBdev1", 00:14:57.765 "uuid": "e8fc3468-59a2-542d-89f1-25f32835bb50", 00:14:57.765 "is_configured": true, 00:14:57.765 "data_offset": 2048, 00:14:57.765 "data_size": 63488 00:14:57.765 }, 00:14:57.765 { 00:14:57.765 "name": "BaseBdev2", 00:14:57.765 "uuid": "f0d3e25d-84d7-5b44-8e22-43112f80df10", 00:14:57.765 "is_configured": true, 
00:14:57.765 "data_offset": 2048, 00:14:57.765 "data_size": 63488 00:14:57.765 }, 00:14:57.765 { 00:14:57.765 "name": "BaseBdev3", 00:14:57.765 "uuid": "34a35df1-cd2a-573b-af81-bb24b29a9be9", 00:14:57.765 "is_configured": true, 00:14:57.766 "data_offset": 2048, 00:14:57.766 "data_size": 63488 00:14:57.766 } 00:14:57.766 ] 00:14:57.766 }' 00:14:57.766 13:43:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.766 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.336 13:43:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:58.597 [2024-06-10 13:43:12.884431] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:58.597 [2024-06-10 13:43:12.884459] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:58.597 [2024-06-10 13:43:12.887279] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:58.597 [2024-06-10 13:43:12.887305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:58.597 [2024-06-10 13:43:12.887386] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:58.597 [2024-06-10 13:43:12.887393] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232d060 name raid_bdev1, state offline 00:14:58.597 0 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1560765 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1560765 ']' 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1560765 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:14:58.597 13:43:12 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1560765 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1560765' 00:14:58.597 killing process with pid 1560765 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1560765 00:14:58.597 [2024-06-10 13:43:12.955186] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:58.597 13:43:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1560765 00:14:58.597 [2024-06-10 13:43:12.966469] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.au0xP9NHMJ 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:58.858 00:14:58.858 real 0m6.095s 00:14:58.858 user 0m9.752s 00:14:58.858 sys 0m0.866s 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:58.858 13:43:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.858 ************************************ 00:14:58.858 END TEST raid_read_error_test 00:14:58.858 ************************************ 00:14:58.858 13:43:13 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:14:58.858 13:43:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:58.858 13:43:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:58.858 13:43:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:58.858 ************************************ 00:14:58.858 START TEST raid_write_error_test 00:14:58.858 ************************************ 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 write 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:58.858 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:58.859 13:43:13 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6GYYLpuwk6 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1562177 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1562177 /var/tmp/spdk-raid.sock 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1562177 ']' 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:58.859 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:58.859 13:43:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.859 [2024-06-10 13:43:13.248905] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:14:58.859 [2024-06-10 13:43:13.248948] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562177 ] 00:14:59.119 [2024-06-10 13:43:13.336061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:59.119 [2024-06-10 13:43:13.400773] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:14:59.119 [2024-06-10 13:43:13.445402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:59.119 [2024-06-10 13:43:13.445427] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:59.689 13:43:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:59.689 13:43:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:59.689 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:59.689 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:59.949 BaseBdev1_malloc 00:14:59.949 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:00.209 true 00:15:00.209 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:00.209 [2024-06-10 13:43:14.656956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:00.209 [2024-06-10 13:43:14.656990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:15:00.209 [2024-06-10 13:43:14.657002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f15c90 00:15:00.209 [2024-06-10 13:43:14.657009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.209 [2024-06-10 13:43:14.658438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.209 [2024-06-10 13:43:14.658459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:00.209 BaseBdev1 00:15:00.209 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:00.209 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:00.469 BaseBdev2_malloc 00:15:00.469 13:43:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:00.729 true 00:15:00.729 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:00.990 [2024-06-10 13:43:15.248352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:00.990 [2024-06-10 13:43:15.248379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.990 [2024-06-10 13:43:15.248390] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1a400 00:15:00.990 [2024-06-10 13:43:15.248396] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.990 [2024-06-10 13:43:15.249631] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.990 [2024-06-10 13:43:15.249649] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:00.990 BaseBdev2 00:15:00.990 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:00.990 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:00.990 BaseBdev3_malloc 00:15:00.990 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:01.251 true 00:15:01.251 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:01.512 [2024-06-10 13:43:15.839670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:01.512 [2024-06-10 13:43:15.839698] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.512 [2024-06-10 13:43:15.839712] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1cfc0 00:15:01.512 [2024-06-10 13:43:15.839719] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.512 [2024-06-10 13:43:15.840983] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.512 [2024-06-10 13:43:15.841004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:01.512 BaseBdev3 00:15:01.512 13:43:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:01.773 [2024-06-10 13:43:15.992080] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:01.773 [2024-06-10 13:43:15.993149] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:01.773 [2024-06-10 13:43:15.993209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:01.773 [2024-06-10 13:43:15.993382] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f1b060 00:15:01.773 [2024-06-10 13:43:15.993389] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:01.773 [2024-06-10 13:43:15.993541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6bea0 00:15:01.773 [2024-06-10 13:43:15.993664] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f1b060 00:15:01.773 [2024-06-10 13:43:15.993670] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f1b060 00:15:01.773 [2024-06-10 13:43:15.993749] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.773 "name": "raid_bdev1", 00:15:01.773 "uuid": "a0b4a0e1-1a04-43eb-9988-0bc506695bd8", 00:15:01.773 "strip_size_kb": 0, 00:15:01.773 "state": "online", 00:15:01.773 "raid_level": "raid1", 00:15:01.773 "superblock": true, 00:15:01.773 "num_base_bdevs": 3, 00:15:01.773 "num_base_bdevs_discovered": 3, 00:15:01.773 "num_base_bdevs_operational": 3, 00:15:01.773 "base_bdevs_list": [ 00:15:01.773 { 00:15:01.773 "name": "BaseBdev1", 00:15:01.773 "uuid": "f40e0a50-73df-51be-83c7-5c43c49a157a", 00:15:01.773 "is_configured": true, 00:15:01.773 "data_offset": 2048, 00:15:01.773 "data_size": 63488 00:15:01.773 }, 00:15:01.773 { 00:15:01.773 "name": "BaseBdev2", 00:15:01.773 "uuid": "1c5e5650-5539-5d8d-b6a3-453951e532dd", 00:15:01.773 "is_configured": true, 00:15:01.773 "data_offset": 2048, 00:15:01.773 "data_size": 63488 00:15:01.773 }, 00:15:01.773 { 00:15:01.773 "name": "BaseBdev3", 00:15:01.773 "uuid": "a000e808-036a-5f06-9c41-941874a3210c", 00:15:01.773 "is_configured": true, 00:15:01.773 "data_offset": 2048, 00:15:01.773 "data_size": 63488 00:15:01.773 } 00:15:01.773 ] 00:15:01.773 }' 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.773 13:43:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.344 13:43:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:15:02.344 13:43:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:02.605 [2024-06-10 13:43:16.878542] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6b9e0 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:03.573 [2024-06-10 13:43:17.978835] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:15:03.573 [2024-06-10 13:43:17.978875] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:03.573 [2024-06-10 13:43:17.979057] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d6b9e0 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.573 13:43:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.573 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.573 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.834 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.834 "name": "raid_bdev1", 00:15:03.834 "uuid": "a0b4a0e1-1a04-43eb-9988-0bc506695bd8", 00:15:03.834 "strip_size_kb": 0, 00:15:03.834 "state": "online", 00:15:03.834 "raid_level": "raid1", 00:15:03.834 "superblock": true, 00:15:03.834 "num_base_bdevs": 3, 00:15:03.834 "num_base_bdevs_discovered": 2, 00:15:03.834 "num_base_bdevs_operational": 2, 00:15:03.834 "base_bdevs_list": [ 00:15:03.834 { 00:15:03.834 "name": null, 00:15:03.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.834 "is_configured": false, 00:15:03.834 "data_offset": 2048, 00:15:03.834 "data_size": 63488 00:15:03.834 }, 00:15:03.834 { 00:15:03.834 "name": "BaseBdev2", 00:15:03.834 "uuid": "1c5e5650-5539-5d8d-b6a3-453951e532dd", 00:15:03.834 "is_configured": true, 00:15:03.834 "data_offset": 2048, 00:15:03.834 "data_size": 63488 00:15:03.834 }, 00:15:03.834 { 00:15:03.834 "name": "BaseBdev3", 00:15:03.834 "uuid": "a000e808-036a-5f06-9c41-941874a3210c", 00:15:03.834 "is_configured": true, 00:15:03.834 "data_offset": 2048, 
00:15:03.834 "data_size": 63488 00:15:03.834 } 00:15:03.834 ] 00:15:03.834 }' 00:15:03.834 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.834 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.405 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:04.666 [2024-06-10 13:43:18.946627] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:04.666 [2024-06-10 13:43:18.946660] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:04.666 [2024-06-10 13:43:18.949437] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:04.666 [2024-06-10 13:43:18.949460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.666 [2024-06-10 13:43:18.949520] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:04.666 [2024-06-10 13:43:18.949526] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1b060 name raid_bdev1, state offline 00:15:04.666 0 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1562177 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1562177 ']' 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1562177 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:04.666 13:43:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1562177 00:15:04.666 13:43:19 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:04.666 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:04.666 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1562177' 00:15:04.666 killing process with pid 1562177 00:15:04.666 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1562177 00:15:04.666 [2024-06-10 13:43:19.017197] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:04.666 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1562177 00:15:04.666 [2024-06-10 13:43:19.028446] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6GYYLpuwk6 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:04.927 00:15:04.927 real 0m5.986s 00:15:04.927 user 0m9.561s 00:15:04.927 sys 0m0.837s 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:04.927 13:43:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.927 ************************************ 00:15:04.927 END TEST raid_write_error_test 00:15:04.927 
************************************ 00:15:04.927 13:43:19 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:15:04.927 13:43:19 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:04.927 13:43:19 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:15:04.927 13:43:19 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:04.927 13:43:19 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:04.927 13:43:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:04.927 ************************************ 00:15:04.927 START TEST raid_state_function_test 00:15:04.927 ************************************ 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 false 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:04.927 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1563368 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1563368' 00:15:04.928 Process raid pid: 1563368 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1563368 /var/tmp/spdk-raid.sock 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1563368 ']' 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:04.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:04.928 13:43:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.928 [2024-06-10 13:43:19.303629] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:15:04.928 [2024-06-10 13:43:19.303679] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:04.928 [2024-06-10 13:43:19.396096] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:05.188 [2024-06-10 13:43:19.471216] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:15:05.188 [2024-06-10 13:43:19.516684] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.188 [2024-06-10 13:43:19.516706] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:05.757 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:05.757 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:15:05.757 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:06.016 [2024-06-10 13:43:20.345601] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:06.017 [2024-06-10 13:43:20.345638] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:06.017 [2024-06-10 13:43:20.345645] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:06.017 [2024-06-10 13:43:20.345652] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:06.017 [2024-06-10 13:43:20.345657] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:06.017 [2024-06-10 13:43:20.345663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:06.017 [2024-06-10 
13:43:20.345668] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:06.017 [2024-06-10 13:43:20.345674] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.017 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.277 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.277 "name": "Existed_Raid", 00:15:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.277 "strip_size_kb": 64, 00:15:06.277 "state": 
"configuring", 00:15:06.277 "raid_level": "raid0", 00:15:06.277 "superblock": false, 00:15:06.277 "num_base_bdevs": 4, 00:15:06.277 "num_base_bdevs_discovered": 0, 00:15:06.277 "num_base_bdevs_operational": 4, 00:15:06.277 "base_bdevs_list": [ 00:15:06.277 { 00:15:06.277 "name": "BaseBdev1", 00:15:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.277 "is_configured": false, 00:15:06.277 "data_offset": 0, 00:15:06.277 "data_size": 0 00:15:06.277 }, 00:15:06.277 { 00:15:06.277 "name": "BaseBdev2", 00:15:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.277 "is_configured": false, 00:15:06.277 "data_offset": 0, 00:15:06.277 "data_size": 0 00:15:06.277 }, 00:15:06.277 { 00:15:06.277 "name": "BaseBdev3", 00:15:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.277 "is_configured": false, 00:15:06.277 "data_offset": 0, 00:15:06.277 "data_size": 0 00:15:06.277 }, 00:15:06.277 { 00:15:06.277 "name": "BaseBdev4", 00:15:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.277 "is_configured": false, 00:15:06.277 "data_offset": 0, 00:15:06.277 "data_size": 0 00:15:06.277 } 00:15:06.277 ] 00:15:06.277 }' 00:15:06.277 13:43:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.277 13:43:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.849 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:06.849 [2024-06-10 13:43:21.323954] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:06.849 [2024-06-10 13:43:21.323974] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ff890 name Existed_Raid, state configuring 00:15:07.109 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:07.109 [2024-06-10 13:43:21.468345] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:07.109 [2024-06-10 13:43:21.468363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:07.109 [2024-06-10 13:43:21.468368] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.109 [2024-06-10 13:43:21.468374] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.109 [2024-06-10 13:43:21.468380] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.109 [2024-06-10 13:43:21.468386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.109 [2024-06-10 13:43:21.468390] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:07.109 [2024-06-10 13:43:21.468396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:07.109 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:07.370 [2024-06-10 13:43:21.679697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:07.370 BaseBdev1 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:07.370 13:43:21 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:07.370 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.630 13:43:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:07.630 [ 00:15:07.630 { 00:15:07.630 "name": "BaseBdev1", 00:15:07.630 "aliases": [ 00:15:07.630 "aec2250c-d8ac-4bf0-97da-175d05cbe95e" 00:15:07.630 ], 00:15:07.630 "product_name": "Malloc disk", 00:15:07.630 "block_size": 512, 00:15:07.630 "num_blocks": 65536, 00:15:07.631 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:07.631 "assigned_rate_limits": { 00:15:07.631 "rw_ios_per_sec": 0, 00:15:07.631 "rw_mbytes_per_sec": 0, 00:15:07.631 "r_mbytes_per_sec": 0, 00:15:07.631 "w_mbytes_per_sec": 0 00:15:07.631 }, 00:15:07.631 "claimed": true, 00:15:07.631 "claim_type": "exclusive_write", 00:15:07.631 "zoned": false, 00:15:07.631 "supported_io_types": { 00:15:07.631 "read": true, 00:15:07.631 "write": true, 00:15:07.631 "unmap": true, 00:15:07.631 "write_zeroes": true, 00:15:07.631 "flush": true, 00:15:07.631 "reset": true, 00:15:07.631 "compare": false, 00:15:07.631 "compare_and_write": false, 00:15:07.631 "abort": true, 00:15:07.631 "nvme_admin": false, 00:15:07.631 "nvme_io": false 00:15:07.631 }, 00:15:07.631 "memory_domains": [ 00:15:07.631 { 00:15:07.631 "dma_device_id": "system", 00:15:07.631 "dma_device_type": 1 00:15:07.631 }, 00:15:07.631 { 00:15:07.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.631 "dma_device_type": 2 00:15:07.631 } 00:15:07.631 ], 00:15:07.631 "driver_specific": {} 00:15:07.631 } 00:15:07.631 ] 00:15:07.631 
13:43:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.631 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.891 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.891 "name": "Existed_Raid", 00:15:07.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.891 "strip_size_kb": 64, 00:15:07.891 "state": "configuring", 00:15:07.891 "raid_level": "raid0", 00:15:07.891 "superblock": false, 00:15:07.891 "num_base_bdevs": 4, 00:15:07.891 
"num_base_bdevs_discovered": 1, 00:15:07.891 "num_base_bdevs_operational": 4, 00:15:07.891 "base_bdevs_list": [ 00:15:07.891 { 00:15:07.891 "name": "BaseBdev1", 00:15:07.891 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:07.891 "is_configured": true, 00:15:07.891 "data_offset": 0, 00:15:07.891 "data_size": 65536 00:15:07.891 }, 00:15:07.891 { 00:15:07.891 "name": "BaseBdev2", 00:15:07.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.891 "is_configured": false, 00:15:07.891 "data_offset": 0, 00:15:07.891 "data_size": 0 00:15:07.891 }, 00:15:07.891 { 00:15:07.891 "name": "BaseBdev3", 00:15:07.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.891 "is_configured": false, 00:15:07.891 "data_offset": 0, 00:15:07.891 "data_size": 0 00:15:07.891 }, 00:15:07.891 { 00:15:07.891 "name": "BaseBdev4", 00:15:07.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.891 "is_configured": false, 00:15:07.891 "data_offset": 0, 00:15:07.891 "data_size": 0 00:15:07.891 } 00:15:07.891 ] 00:15:07.891 }' 00:15:07.891 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.891 13:43:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.462 13:43:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:08.723 [2024-06-10 13:43:23.043254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:08.723 [2024-06-10 13:43:23.043283] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ff100 name Existed_Raid, state configuring 00:15:08.723 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:08.983 
[2024-06-10 13:43:23.235775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:08.983 [2024-06-10 13:43:23.237005] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:08.983 [2024-06-10 13:43:23.237029] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:08.983 [2024-06-10 13:43:23.237035] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:08.983 [2024-06-10 13:43:23.237041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:08.983 [2024-06-10 13:43:23.237046] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:08.983 [2024-06-10 13:43:23.237052] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:08.983 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:08.983 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:08.983 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.984 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.244 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.244 "name": "Existed_Raid", 00:15:09.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.244 "strip_size_kb": 64, 00:15:09.244 "state": "configuring", 00:15:09.244 "raid_level": "raid0", 00:15:09.244 "superblock": false, 00:15:09.244 "num_base_bdevs": 4, 00:15:09.244 "num_base_bdevs_discovered": 1, 00:15:09.244 "num_base_bdevs_operational": 4, 00:15:09.244 "base_bdevs_list": [ 00:15:09.244 { 00:15:09.244 "name": "BaseBdev1", 00:15:09.244 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:09.244 "is_configured": true, 00:15:09.244 "data_offset": 0, 00:15:09.244 "data_size": 65536 00:15:09.244 }, 00:15:09.244 { 00:15:09.244 "name": "BaseBdev2", 00:15:09.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.244 "is_configured": false, 00:15:09.244 "data_offset": 0, 00:15:09.244 "data_size": 0 00:15:09.244 }, 00:15:09.244 { 00:15:09.244 "name": "BaseBdev3", 00:15:09.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.244 "is_configured": false, 00:15:09.244 "data_offset": 0, 00:15:09.244 "data_size": 0 00:15:09.244 }, 00:15:09.244 { 00:15:09.244 "name": "BaseBdev4", 00:15:09.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.244 "is_configured": false, 00:15:09.244 "data_offset": 0, 00:15:09.244 "data_size": 0 00:15:09.244 } 
00:15:09.244 ] 00:15:09.244 }' 00:15:09.244 13:43:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.244 13:43:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:09.814 [2024-06-10 13:43:24.199479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:09.814 BaseBdev2 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:09.814 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:10.074 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:10.336 [ 00:15:10.336 { 00:15:10.336 "name": "BaseBdev2", 00:15:10.336 "aliases": [ 00:15:10.336 "69ba6171-51b8-4d04-84f4-389995928cdb" 00:15:10.336 ], 00:15:10.336 "product_name": "Malloc disk", 00:15:10.336 "block_size": 512, 00:15:10.336 "num_blocks": 65536, 00:15:10.336 "uuid": 
"69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:10.336 "assigned_rate_limits": { 00:15:10.336 "rw_ios_per_sec": 0, 00:15:10.336 "rw_mbytes_per_sec": 0, 00:15:10.336 "r_mbytes_per_sec": 0, 00:15:10.336 "w_mbytes_per_sec": 0 00:15:10.336 }, 00:15:10.336 "claimed": true, 00:15:10.336 "claim_type": "exclusive_write", 00:15:10.336 "zoned": false, 00:15:10.336 "supported_io_types": { 00:15:10.336 "read": true, 00:15:10.336 "write": true, 00:15:10.336 "unmap": true, 00:15:10.336 "write_zeroes": true, 00:15:10.336 "flush": true, 00:15:10.336 "reset": true, 00:15:10.336 "compare": false, 00:15:10.336 "compare_and_write": false, 00:15:10.336 "abort": true, 00:15:10.336 "nvme_admin": false, 00:15:10.336 "nvme_io": false 00:15:10.336 }, 00:15:10.336 "memory_domains": [ 00:15:10.336 { 00:15:10.336 "dma_device_id": "system", 00:15:10.336 "dma_device_type": 1 00:15:10.336 }, 00:15:10.336 { 00:15:10.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.336 "dma_device_type": 2 00:15:10.336 } 00:15:10.336 ], 00:15:10.336 "driver_specific": {} 00:15:10.336 } 00:15:10.336 ] 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.336 "name": "Existed_Raid", 00:15:10.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.336 "strip_size_kb": 64, 00:15:10.336 "state": "configuring", 00:15:10.336 "raid_level": "raid0", 00:15:10.336 "superblock": false, 00:15:10.336 "num_base_bdevs": 4, 00:15:10.336 "num_base_bdevs_discovered": 2, 00:15:10.336 "num_base_bdevs_operational": 4, 00:15:10.336 "base_bdevs_list": [ 00:15:10.336 { 00:15:10.336 "name": "BaseBdev1", 00:15:10.336 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:10.336 "is_configured": true, 00:15:10.336 "data_offset": 0, 00:15:10.336 "data_size": 65536 00:15:10.336 }, 00:15:10.336 { 00:15:10.336 "name": "BaseBdev2", 00:15:10.336 "uuid": "69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:10.336 "is_configured": true, 00:15:10.336 "data_offset": 0, 00:15:10.336 "data_size": 65536 00:15:10.336 }, 00:15:10.336 { 00:15:10.336 "name": "BaseBdev3", 00:15:10.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.336 "is_configured": false, 
00:15:10.336 "data_offset": 0, 00:15:10.336 "data_size": 0 00:15:10.336 }, 00:15:10.336 { 00:15:10.336 "name": "BaseBdev4", 00:15:10.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.336 "is_configured": false, 00:15:10.336 "data_offset": 0, 00:15:10.336 "data_size": 0 00:15:10.336 } 00:15:10.336 ] 00:15:10.336 }' 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.336 13:43:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.907 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:11.167 [2024-06-10 13:43:25.535987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:11.167 BaseBdev3 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:11.167 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:11.427 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 
00:15:11.688 [ 00:15:11.688 { 00:15:11.688 "name": "BaseBdev3", 00:15:11.688 "aliases": [ 00:15:11.688 "c48456a9-970c-4dad-a22a-44e9afa25371" 00:15:11.688 ], 00:15:11.688 "product_name": "Malloc disk", 00:15:11.688 "block_size": 512, 00:15:11.688 "num_blocks": 65536, 00:15:11.688 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:11.688 "assigned_rate_limits": { 00:15:11.688 "rw_ios_per_sec": 0, 00:15:11.688 "rw_mbytes_per_sec": 0, 00:15:11.688 "r_mbytes_per_sec": 0, 00:15:11.688 "w_mbytes_per_sec": 0 00:15:11.688 }, 00:15:11.688 "claimed": true, 00:15:11.688 "claim_type": "exclusive_write", 00:15:11.688 "zoned": false, 00:15:11.688 "supported_io_types": { 00:15:11.688 "read": true, 00:15:11.688 "write": true, 00:15:11.688 "unmap": true, 00:15:11.688 "write_zeroes": true, 00:15:11.688 "flush": true, 00:15:11.688 "reset": true, 00:15:11.688 "compare": false, 00:15:11.688 "compare_and_write": false, 00:15:11.688 "abort": true, 00:15:11.688 "nvme_admin": false, 00:15:11.688 "nvme_io": false 00:15:11.688 }, 00:15:11.688 "memory_domains": [ 00:15:11.688 { 00:15:11.688 "dma_device_id": "system", 00:15:11.688 "dma_device_type": 1 00:15:11.688 }, 00:15:11.688 { 00:15:11.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.688 "dma_device_type": 2 00:15:11.688 } 00:15:11.688 ], 00:15:11.688 "driver_specific": {} 00:15:11.688 } 00:15:11.688 ] 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.688 13:43:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.688 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.688 "name": "Existed_Raid", 00:15:11.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.688 "strip_size_kb": 64, 00:15:11.688 "state": "configuring", 00:15:11.688 "raid_level": "raid0", 00:15:11.689 "superblock": false, 00:15:11.689 "num_base_bdevs": 4, 00:15:11.689 "num_base_bdevs_discovered": 3, 00:15:11.689 "num_base_bdevs_operational": 4, 00:15:11.689 "base_bdevs_list": [ 00:15:11.689 { 00:15:11.689 "name": "BaseBdev1", 00:15:11.689 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:11.689 "is_configured": true, 00:15:11.689 "data_offset": 0, 00:15:11.689 "data_size": 65536 00:15:11.689 }, 00:15:11.689 { 00:15:11.689 "name": "BaseBdev2", 00:15:11.689 "uuid": 
"69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:11.689 "is_configured": true, 00:15:11.689 "data_offset": 0, 00:15:11.689 "data_size": 65536 00:15:11.689 }, 00:15:11.689 { 00:15:11.689 "name": "BaseBdev3", 00:15:11.689 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:11.689 "is_configured": true, 00:15:11.689 "data_offset": 0, 00:15:11.689 "data_size": 65536 00:15:11.689 }, 00:15:11.689 { 00:15:11.689 "name": "BaseBdev4", 00:15:11.689 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.689 "is_configured": false, 00:15:11.689 "data_offset": 0, 00:15:11.689 "data_size": 0 00:15:11.689 } 00:15:11.689 ] 00:15:11.689 }' 00:15:11.689 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.689 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.261 13:43:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:12.521 [2024-06-10 13:43:26.832323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:12.521 [2024-06-10 13:43:26.832346] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2400160 00:15:12.521 [2024-06-10 13:43:26.832350] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:12.521 [2024-06-10 13:43:26.832543] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ebf20 00:15:12.521 [2024-06-10 13:43:26.832641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2400160 00:15:12.521 [2024-06-10 13:43:26.832647] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2400160 00:15:12.521 [2024-06-10 13:43:26.832774] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.521 BaseBdev4 00:15:12.521 13:43:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:12.521 13:43:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.781 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:12.781 [ 00:15:12.782 { 00:15:12.782 "name": "BaseBdev4", 00:15:12.782 "aliases": [ 00:15:12.782 "9cf91652-b910-4ec7-b030-c8db9a0566f0" 00:15:12.782 ], 00:15:12.782 "product_name": "Malloc disk", 00:15:12.782 "block_size": 512, 00:15:12.782 "num_blocks": 65536, 00:15:12.782 "uuid": "9cf91652-b910-4ec7-b030-c8db9a0566f0", 00:15:12.782 "assigned_rate_limits": { 00:15:12.782 "rw_ios_per_sec": 0, 00:15:12.782 "rw_mbytes_per_sec": 0, 00:15:12.782 "r_mbytes_per_sec": 0, 00:15:12.782 "w_mbytes_per_sec": 0 00:15:12.782 }, 00:15:12.782 "claimed": true, 00:15:12.782 "claim_type": "exclusive_write", 00:15:12.782 "zoned": false, 00:15:12.782 "supported_io_types": { 00:15:12.782 "read": true, 00:15:12.782 "write": true, 00:15:12.782 "unmap": true, 00:15:12.782 "write_zeroes": true, 00:15:12.782 "flush": true, 00:15:12.782 "reset": true, 00:15:12.782 "compare": false, 00:15:12.782 "compare_and_write": false, 
00:15:12.782 "abort": true, 00:15:12.782 "nvme_admin": false, 00:15:12.782 "nvme_io": false 00:15:12.782 }, 00:15:12.782 "memory_domains": [ 00:15:12.782 { 00:15:12.782 "dma_device_id": "system", 00:15:12.782 "dma_device_type": 1 00:15:12.782 }, 00:15:12.782 { 00:15:12.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.782 "dma_device_type": 2 00:15:12.782 } 00:15:12.782 ], 00:15:12.782 "driver_specific": {} 00:15:12.782 } 00:15:12.782 ] 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.782 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.043 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.043 "name": "Existed_Raid", 00:15:13.043 "uuid": "4376d641-ee9c-49fd-a36f-50977a21d380", 00:15:13.043 "strip_size_kb": 64, 00:15:13.043 "state": "online", 00:15:13.043 "raid_level": "raid0", 00:15:13.043 "superblock": false, 00:15:13.043 "num_base_bdevs": 4, 00:15:13.043 "num_base_bdevs_discovered": 4, 00:15:13.043 "num_base_bdevs_operational": 4, 00:15:13.043 "base_bdevs_list": [ 00:15:13.043 { 00:15:13.043 "name": "BaseBdev1", 00:15:13.043 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:13.043 "is_configured": true, 00:15:13.043 "data_offset": 0, 00:15:13.043 "data_size": 65536 00:15:13.043 }, 00:15:13.043 { 00:15:13.043 "name": "BaseBdev2", 00:15:13.043 "uuid": "69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:13.043 "is_configured": true, 00:15:13.043 "data_offset": 0, 00:15:13.043 "data_size": 65536 00:15:13.043 }, 00:15:13.043 { 00:15:13.043 "name": "BaseBdev3", 00:15:13.043 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:13.043 "is_configured": true, 00:15:13.043 "data_offset": 0, 00:15:13.043 "data_size": 65536 00:15:13.043 }, 00:15:13.043 { 00:15:13.043 "name": "BaseBdev4", 00:15:13.043 "uuid": "9cf91652-b910-4ec7-b030-c8db9a0566f0", 00:15:13.043 "is_configured": true, 00:15:13.043 "data_offset": 0, 00:15:13.043 "data_size": 65536 00:15:13.043 } 00:15:13.043 ] 00:15:13.043 }' 00:15:13.043 13:43:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.043 13:43:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:13.615 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:13.875 [2024-06-10 13:43:28.155931] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:13.875 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:13.875 "name": "Existed_Raid", 00:15:13.875 "aliases": [ 00:15:13.875 "4376d641-ee9c-49fd-a36f-50977a21d380" 00:15:13.875 ], 00:15:13.875 "product_name": "Raid Volume", 00:15:13.875 "block_size": 512, 00:15:13.875 "num_blocks": 262144, 00:15:13.876 "uuid": "4376d641-ee9c-49fd-a36f-50977a21d380", 00:15:13.876 "assigned_rate_limits": { 00:15:13.876 "rw_ios_per_sec": 0, 00:15:13.876 "rw_mbytes_per_sec": 0, 00:15:13.876 "r_mbytes_per_sec": 0, 00:15:13.876 "w_mbytes_per_sec": 0 00:15:13.876 }, 00:15:13.876 "claimed": false, 00:15:13.876 "zoned": false, 00:15:13.876 "supported_io_types": { 00:15:13.876 "read": true, 00:15:13.876 "write": true, 00:15:13.876 "unmap": true, 00:15:13.876 "write_zeroes": true, 00:15:13.876 "flush": true, 00:15:13.876 "reset": true, 00:15:13.876 "compare": false, 00:15:13.876 "compare_and_write": false, 00:15:13.876 "abort": false, 00:15:13.876 "nvme_admin": false, 
00:15:13.876 "nvme_io": false 00:15:13.876 }, 00:15:13.876 "memory_domains": [ 00:15:13.876 { 00:15:13.876 "dma_device_id": "system", 00:15:13.876 "dma_device_type": 1 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.876 "dma_device_type": 2 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "system", 00:15:13.876 "dma_device_type": 1 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.876 "dma_device_type": 2 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "system", 00:15:13.876 "dma_device_type": 1 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.876 "dma_device_type": 2 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "system", 00:15:13.876 "dma_device_type": 1 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.876 "dma_device_type": 2 00:15:13.876 } 00:15:13.876 ], 00:15:13.876 "driver_specific": { 00:15:13.876 "raid": { 00:15:13.876 "uuid": "4376d641-ee9c-49fd-a36f-50977a21d380", 00:15:13.876 "strip_size_kb": 64, 00:15:13.876 "state": "online", 00:15:13.876 "raid_level": "raid0", 00:15:13.876 "superblock": false, 00:15:13.876 "num_base_bdevs": 4, 00:15:13.876 "num_base_bdevs_discovered": 4, 00:15:13.876 "num_base_bdevs_operational": 4, 00:15:13.876 "base_bdevs_list": [ 00:15:13.876 { 00:15:13.876 "name": "BaseBdev1", 00:15:13.876 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:13.876 "is_configured": true, 00:15:13.876 "data_offset": 0, 00:15:13.876 "data_size": 65536 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "name": "BaseBdev2", 00:15:13.876 "uuid": "69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:13.876 "is_configured": true, 00:15:13.876 "data_offset": 0, 00:15:13.876 "data_size": 65536 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "name": "BaseBdev3", 00:15:13.876 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:13.876 "is_configured": 
true, 00:15:13.876 "data_offset": 0, 00:15:13.876 "data_size": 65536 00:15:13.876 }, 00:15:13.876 { 00:15:13.876 "name": "BaseBdev4", 00:15:13.876 "uuid": "9cf91652-b910-4ec7-b030-c8db9a0566f0", 00:15:13.876 "is_configured": true, 00:15:13.876 "data_offset": 0, 00:15:13.876 "data_size": 65536 00:15:13.876 } 00:15:13.876 ] 00:15:13.876 } 00:15:13.876 } 00:15:13.876 }' 00:15:13.876 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:13.876 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:13.876 BaseBdev2 00:15:13.876 BaseBdev3 00:15:13.876 BaseBdev4' 00:15:13.876 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:13.876 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:13.876 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.136 "name": "BaseBdev1", 00:15:14.136 "aliases": [ 00:15:14.136 "aec2250c-d8ac-4bf0-97da-175d05cbe95e" 00:15:14.136 ], 00:15:14.136 "product_name": "Malloc disk", 00:15:14.136 "block_size": 512, 00:15:14.136 "num_blocks": 65536, 00:15:14.136 "uuid": "aec2250c-d8ac-4bf0-97da-175d05cbe95e", 00:15:14.136 "assigned_rate_limits": { 00:15:14.136 "rw_ios_per_sec": 0, 00:15:14.136 "rw_mbytes_per_sec": 0, 00:15:14.136 "r_mbytes_per_sec": 0, 00:15:14.136 "w_mbytes_per_sec": 0 00:15:14.136 }, 00:15:14.136 "claimed": true, 00:15:14.136 "claim_type": "exclusive_write", 00:15:14.136 "zoned": false, 00:15:14.136 "supported_io_types": { 00:15:14.136 "read": true, 00:15:14.136 "write": true, 00:15:14.136 "unmap": true, 00:15:14.136 "write_zeroes": 
true, 00:15:14.136 "flush": true, 00:15:14.136 "reset": true, 00:15:14.136 "compare": false, 00:15:14.136 "compare_and_write": false, 00:15:14.136 "abort": true, 00:15:14.136 "nvme_admin": false, 00:15:14.136 "nvme_io": false 00:15:14.136 }, 00:15:14.136 "memory_domains": [ 00:15:14.136 { 00:15:14.136 "dma_device_id": "system", 00:15:14.136 "dma_device_type": 1 00:15:14.136 }, 00:15:14.136 { 00:15:14.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.136 "dma_device_type": 2 00:15:14.136 } 00:15:14.136 ], 00:15:14.136 "driver_specific": {} 00:15:14.136 }' 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.136 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:15:14.396 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:14.656 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.656 "name": "BaseBdev2", 00:15:14.656 "aliases": [ 00:15:14.656 "69ba6171-51b8-4d04-84f4-389995928cdb" 00:15:14.656 ], 00:15:14.656 "product_name": "Malloc disk", 00:15:14.656 "block_size": 512, 00:15:14.656 "num_blocks": 65536, 00:15:14.656 "uuid": "69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:14.656 "assigned_rate_limits": { 00:15:14.656 "rw_ios_per_sec": 0, 00:15:14.656 "rw_mbytes_per_sec": 0, 00:15:14.656 "r_mbytes_per_sec": 0, 00:15:14.656 "w_mbytes_per_sec": 0 00:15:14.656 }, 00:15:14.656 "claimed": true, 00:15:14.656 "claim_type": "exclusive_write", 00:15:14.656 "zoned": false, 00:15:14.656 "supported_io_types": { 00:15:14.656 "read": true, 00:15:14.656 "write": true, 00:15:14.656 "unmap": true, 00:15:14.656 "write_zeroes": true, 00:15:14.656 "flush": true, 00:15:14.656 "reset": true, 00:15:14.656 "compare": false, 00:15:14.656 "compare_and_write": false, 00:15:14.656 "abort": true, 00:15:14.656 "nvme_admin": false, 00:15:14.656 "nvme_io": false 00:15:14.656 }, 00:15:14.656 "memory_domains": [ 00:15:14.656 { 00:15:14.656 "dma_device_id": "system", 00:15:14.656 "dma_device_type": 1 00:15:14.656 }, 00:15:14.656 { 00:15:14.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.656 "dma_device_type": 2 00:15:14.656 } 00:15:14.656 ], 00:15:14.656 "driver_specific": {} 00:15:14.656 }' 00:15:14.656 13:43:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.656 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.656 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:14.656 13:43:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.656 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.917 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:15.177 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.177 "name": "BaseBdev3", 00:15:15.177 "aliases": [ 00:15:15.177 "c48456a9-970c-4dad-a22a-44e9afa25371" 00:15:15.177 ], 00:15:15.177 "product_name": "Malloc disk", 00:15:15.177 "block_size": 512, 00:15:15.177 "num_blocks": 65536, 00:15:15.177 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:15.177 "assigned_rate_limits": { 00:15:15.177 "rw_ios_per_sec": 0, 00:15:15.177 "rw_mbytes_per_sec": 0, 00:15:15.177 "r_mbytes_per_sec": 0, 00:15:15.177 "w_mbytes_per_sec": 0 00:15:15.177 }, 00:15:15.177 "claimed": true, 00:15:15.177 "claim_type": "exclusive_write", 
00:15:15.177 "zoned": false, 00:15:15.177 "supported_io_types": { 00:15:15.177 "read": true, 00:15:15.177 "write": true, 00:15:15.177 "unmap": true, 00:15:15.177 "write_zeroes": true, 00:15:15.177 "flush": true, 00:15:15.177 "reset": true, 00:15:15.177 "compare": false, 00:15:15.177 "compare_and_write": false, 00:15:15.177 "abort": true, 00:15:15.177 "nvme_admin": false, 00:15:15.177 "nvme_io": false 00:15:15.177 }, 00:15:15.177 "memory_domains": [ 00:15:15.177 { 00:15:15.177 "dma_device_id": "system", 00:15:15.177 "dma_device_type": 1 00:15:15.177 }, 00:15:15.177 { 00:15:15.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.177 "dma_device_type": 2 00:15:15.177 } 00:15:15.177 ], 00:15:15.177 "driver_specific": {} 00:15:15.177 }' 00:15:15.177 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.177 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.438 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.698 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.698 13:43:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.698 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:15.698 13:43:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:15.698 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.698 "name": "BaseBdev4", 00:15:15.698 "aliases": [ 00:15:15.698 "9cf91652-b910-4ec7-b030-c8db9a0566f0" 00:15:15.698 ], 00:15:15.698 "product_name": "Malloc disk", 00:15:15.698 "block_size": 512, 00:15:15.698 "num_blocks": 65536, 00:15:15.698 "uuid": "9cf91652-b910-4ec7-b030-c8db9a0566f0", 00:15:15.698 "assigned_rate_limits": { 00:15:15.698 "rw_ios_per_sec": 0, 00:15:15.698 "rw_mbytes_per_sec": 0, 00:15:15.698 "r_mbytes_per_sec": 0, 00:15:15.698 "w_mbytes_per_sec": 0 00:15:15.698 }, 00:15:15.698 "claimed": true, 00:15:15.698 "claim_type": "exclusive_write", 00:15:15.698 "zoned": false, 00:15:15.698 "supported_io_types": { 00:15:15.698 "read": true, 00:15:15.698 "write": true, 00:15:15.698 "unmap": true, 00:15:15.698 "write_zeroes": true, 00:15:15.698 "flush": true, 00:15:15.698 "reset": true, 00:15:15.698 "compare": false, 00:15:15.698 "compare_and_write": false, 00:15:15.698 "abort": true, 00:15:15.698 "nvme_admin": false, 00:15:15.698 "nvme_io": false 00:15:15.698 }, 00:15:15.698 "memory_domains": [ 00:15:15.698 { 00:15:15.698 "dma_device_id": "system", 00:15:15.698 "dma_device_type": 1 00:15:15.698 }, 00:15:15.698 { 00:15:15.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.698 "dma_device_type": 2 00:15:15.698 } 00:15:15.698 ], 00:15:15.698 "driver_specific": {} 00:15:15.698 }' 00:15:15.698 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.698 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.958 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.218 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.219 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.219 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:16.219 [2024-06-10 13:43:30.678220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:16.219 [2024-06-10 13:43:30.678241] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:16.219 [2024-06-10 13:43:30.678280] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@215 -- # return 1 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.479 "name": "Existed_Raid", 00:15:16.479 "uuid": "4376d641-ee9c-49fd-a36f-50977a21d380", 00:15:16.479 "strip_size_kb": 64, 00:15:16.479 "state": "offline", 00:15:16.479 "raid_level": "raid0", 00:15:16.479 "superblock": false, 00:15:16.479 
"num_base_bdevs": 4, 00:15:16.479 "num_base_bdevs_discovered": 3, 00:15:16.479 "num_base_bdevs_operational": 3, 00:15:16.479 "base_bdevs_list": [ 00:15:16.479 { 00:15:16.479 "name": null, 00:15:16.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.479 "is_configured": false, 00:15:16.479 "data_offset": 0, 00:15:16.479 "data_size": 65536 00:15:16.479 }, 00:15:16.479 { 00:15:16.479 "name": "BaseBdev2", 00:15:16.479 "uuid": "69ba6171-51b8-4d04-84f4-389995928cdb", 00:15:16.479 "is_configured": true, 00:15:16.479 "data_offset": 0, 00:15:16.479 "data_size": 65536 00:15:16.479 }, 00:15:16.479 { 00:15:16.479 "name": "BaseBdev3", 00:15:16.479 "uuid": "c48456a9-970c-4dad-a22a-44e9afa25371", 00:15:16.479 "is_configured": true, 00:15:16.479 "data_offset": 0, 00:15:16.479 "data_size": 65536 00:15:16.479 }, 00:15:16.479 { 00:15:16.479 "name": "BaseBdev4", 00:15:16.479 "uuid": "9cf91652-b910-4ec7-b030-c8db9a0566f0", 00:15:16.479 "is_configured": true, 00:15:16.479 "data_offset": 0, 00:15:16.479 "data_size": 65536 00:15:16.479 } 00:15:16.479 ] 00:15:16.479 }' 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.479 13:43:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.051 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:17.051 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:17.051 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.051 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:17.311 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:17.311 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:15:17.311 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:17.572 [2024-06-10 13:43:31.825147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:17.572 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:17.572 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:17.572 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.572 13:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:17.832 [2024-06-10 13:43:32.236207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.832 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:18.092 13:43:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:18.092 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:18.092 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:18.353 [2024-06-10 13:43:32.647242] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:18.353 [2024-06-10 13:43:32.647273] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2400160 name Existed_Raid, state offline 00:15:18.353 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:18.353 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:18.353 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.353 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:18.613 13:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:18.613 BaseBdev2 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:18.613 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.873 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:19.134 [ 00:15:19.134 { 00:15:19.134 "name": "BaseBdev2", 00:15:19.134 "aliases": [ 00:15:19.134 "d57aeb8b-8905-4216-b38e-cd3cb9f61772" 00:15:19.134 ], 00:15:19.134 "product_name": "Malloc disk", 00:15:19.134 "block_size": 512, 00:15:19.134 "num_blocks": 65536, 00:15:19.134 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:19.134 "assigned_rate_limits": { 00:15:19.134 "rw_ios_per_sec": 0, 00:15:19.134 "rw_mbytes_per_sec": 0, 00:15:19.134 "r_mbytes_per_sec": 0, 00:15:19.134 "w_mbytes_per_sec": 0 00:15:19.134 }, 00:15:19.134 "claimed": false, 00:15:19.134 "zoned": false, 00:15:19.134 "supported_io_types": { 00:15:19.134 "read": true, 00:15:19.134 "write": true, 00:15:19.134 "unmap": true, 00:15:19.135 "write_zeroes": true, 00:15:19.135 "flush": true, 00:15:19.135 "reset": true, 00:15:19.135 "compare": false, 00:15:19.135 "compare_and_write": false, 00:15:19.135 "abort": true, 00:15:19.135 "nvme_admin": false, 00:15:19.135 "nvme_io": false 
00:15:19.135 }, 00:15:19.135 "memory_domains": [ 00:15:19.135 { 00:15:19.135 "dma_device_id": "system", 00:15:19.135 "dma_device_type": 1 00:15:19.135 }, 00:15:19.135 { 00:15:19.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.135 "dma_device_type": 2 00:15:19.135 } 00:15:19.135 ], 00:15:19.135 "driver_specific": {} 00:15:19.135 } 00:15:19.135 ] 00:15:19.135 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:19.135 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:19.135 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:19.135 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:19.396 BaseBdev3 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.396 13:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 
2000 00:15:19.656 [ 00:15:19.656 { 00:15:19.656 "name": "BaseBdev3", 00:15:19.656 "aliases": [ 00:15:19.656 "2df923b7-9de3-4cfe-995e-714c8172ef73" 00:15:19.656 ], 00:15:19.656 "product_name": "Malloc disk", 00:15:19.656 "block_size": 512, 00:15:19.656 "num_blocks": 65536, 00:15:19.656 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:19.656 "assigned_rate_limits": { 00:15:19.656 "rw_ios_per_sec": 0, 00:15:19.656 "rw_mbytes_per_sec": 0, 00:15:19.656 "r_mbytes_per_sec": 0, 00:15:19.656 "w_mbytes_per_sec": 0 00:15:19.656 }, 00:15:19.656 "claimed": false, 00:15:19.656 "zoned": false, 00:15:19.656 "supported_io_types": { 00:15:19.656 "read": true, 00:15:19.656 "write": true, 00:15:19.656 "unmap": true, 00:15:19.656 "write_zeroes": true, 00:15:19.656 "flush": true, 00:15:19.656 "reset": true, 00:15:19.656 "compare": false, 00:15:19.656 "compare_and_write": false, 00:15:19.656 "abort": true, 00:15:19.656 "nvme_admin": false, 00:15:19.656 "nvme_io": false 00:15:19.656 }, 00:15:19.656 "memory_domains": [ 00:15:19.656 { 00:15:19.656 "dma_device_id": "system", 00:15:19.656 "dma_device_type": 1 00:15:19.656 }, 00:15:19.656 { 00:15:19.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.656 "dma_device_type": 2 00:15:19.656 } 00:15:19.656 ], 00:15:19.656 "driver_specific": {} 00:15:19.656 } 00:15:19.657 ] 00:15:19.657 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:19.657 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:19.657 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:19.657 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:19.918 BaseBdev4 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:19.918 
13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:19.918 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.179 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:20.179 [ 00:15:20.179 { 00:15:20.179 "name": "BaseBdev4", 00:15:20.179 "aliases": [ 00:15:20.179 "e477afe4-8211-4f8e-b1ef-72537b53778d" 00:15:20.179 ], 00:15:20.179 "product_name": "Malloc disk", 00:15:20.179 "block_size": 512, 00:15:20.179 "num_blocks": 65536, 00:15:20.179 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:20.179 "assigned_rate_limits": { 00:15:20.179 "rw_ios_per_sec": 0, 00:15:20.179 "rw_mbytes_per_sec": 0, 00:15:20.179 "r_mbytes_per_sec": 0, 00:15:20.179 "w_mbytes_per_sec": 0 00:15:20.179 }, 00:15:20.179 "claimed": false, 00:15:20.179 "zoned": false, 00:15:20.179 "supported_io_types": { 00:15:20.179 "read": true, 00:15:20.179 "write": true, 00:15:20.179 "unmap": true, 00:15:20.179 "write_zeroes": true, 00:15:20.179 "flush": true, 00:15:20.179 "reset": true, 00:15:20.179 "compare": false, 00:15:20.179 "compare_and_write": false, 00:15:20.179 "abort": true, 00:15:20.179 "nvme_admin": false, 00:15:20.179 "nvme_io": false 00:15:20.179 }, 00:15:20.179 "memory_domains": [ 00:15:20.179 { 
00:15:20.179 "dma_device_id": "system", 00:15:20.179 "dma_device_type": 1 00:15:20.179 }, 00:15:20.179 { 00:15:20.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.179 "dma_device_type": 2 00:15:20.179 } 00:15:20.179 ], 00:15:20.179 "driver_specific": {} 00:15:20.179 } 00:15:20.179 ] 00:15:20.179 13:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:20.179 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:20.179 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:20.179 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:20.440 [2024-06-10 13:43:34.835269] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:20.440 [2024-06-10 13:43:34.835301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:20.440 [2024-06-10 13:43:34.835315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:20.440 [2024-06-10 13:43:34.836412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:20.440 [2024-06-10 13:43:34.836443] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.440 13:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.700 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.700 "name": "Existed_Raid", 00:15:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.700 "strip_size_kb": 64, 00:15:20.700 "state": "configuring", 00:15:20.700 "raid_level": "raid0", 00:15:20.700 "superblock": false, 00:15:20.700 "num_base_bdevs": 4, 00:15:20.700 "num_base_bdevs_discovered": 3, 00:15:20.700 "num_base_bdevs_operational": 4, 00:15:20.700 "base_bdevs_list": [ 00:15:20.700 { 00:15:20.700 "name": "BaseBdev1", 00:15:20.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.700 "is_configured": false, 00:15:20.700 "data_offset": 0, 00:15:20.700 "data_size": 0 00:15:20.700 }, 00:15:20.700 { 00:15:20.700 "name": "BaseBdev2", 00:15:20.700 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:20.700 "is_configured": true, 00:15:20.700 "data_offset": 0, 00:15:20.700 "data_size": 65536 00:15:20.700 }, 00:15:20.700 { 00:15:20.700 "name": 
"BaseBdev3", 00:15:20.700 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:20.700 "is_configured": true, 00:15:20.700 "data_offset": 0, 00:15:20.700 "data_size": 65536 00:15:20.700 }, 00:15:20.700 { 00:15:20.700 "name": "BaseBdev4", 00:15:20.700 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:20.700 "is_configured": true, 00:15:20.700 "data_offset": 0, 00:15:20.700 "data_size": 65536 00:15:20.700 } 00:15:20.700 ] 00:15:20.700 }' 00:15:20.700 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.700 13:43:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.270 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:21.531 [2024-06-10 13:43:35.793676] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.531 13:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.791 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.791 "name": "Existed_Raid", 00:15:21.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.791 "strip_size_kb": 64, 00:15:21.791 "state": "configuring", 00:15:21.791 "raid_level": "raid0", 00:15:21.791 "superblock": false, 00:15:21.791 "num_base_bdevs": 4, 00:15:21.791 "num_base_bdevs_discovered": 2, 00:15:21.791 "num_base_bdevs_operational": 4, 00:15:21.791 "base_bdevs_list": [ 00:15:21.791 { 00:15:21.791 "name": "BaseBdev1", 00:15:21.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.791 "is_configured": false, 00:15:21.791 "data_offset": 0, 00:15:21.791 "data_size": 0 00:15:21.791 }, 00:15:21.791 { 00:15:21.791 "name": null, 00:15:21.791 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:21.791 "is_configured": false, 00:15:21.791 "data_offset": 0, 00:15:21.791 "data_size": 65536 00:15:21.791 }, 00:15:21.791 { 00:15:21.791 "name": "BaseBdev3", 00:15:21.791 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:21.791 "is_configured": true, 00:15:21.791 "data_offset": 0, 00:15:21.791 "data_size": 65536 00:15:21.791 }, 00:15:21.791 { 00:15:21.791 "name": "BaseBdev4", 00:15:21.791 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:21.791 "is_configured": true, 00:15:21.791 "data_offset": 0, 00:15:21.791 "data_size": 65536 00:15:21.791 } 00:15:21.791 ] 00:15:21.791 }' 00:15:21.791 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:15:21.791 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:22.362 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.362 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:22.362 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:22.362 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:22.622 [2024-06-10 13:43:36.973805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:22.622 BaseBdev1 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:22.622 13:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.882 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:15:23.142 [ 00:15:23.142 { 00:15:23.142 "name": "BaseBdev1", 00:15:23.142 "aliases": [ 00:15:23.142 "65b80383-238e-40c8-8542-3eaf5b018b8c" 00:15:23.142 ], 00:15:23.143 "product_name": "Malloc disk", 00:15:23.143 "block_size": 512, 00:15:23.143 "num_blocks": 65536, 00:15:23.143 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:23.143 "assigned_rate_limits": { 00:15:23.143 "rw_ios_per_sec": 0, 00:15:23.143 "rw_mbytes_per_sec": 0, 00:15:23.143 "r_mbytes_per_sec": 0, 00:15:23.143 "w_mbytes_per_sec": 0 00:15:23.143 }, 00:15:23.143 "claimed": true, 00:15:23.143 "claim_type": "exclusive_write", 00:15:23.143 "zoned": false, 00:15:23.143 "supported_io_types": { 00:15:23.143 "read": true, 00:15:23.143 "write": true, 00:15:23.143 "unmap": true, 00:15:23.143 "write_zeroes": true, 00:15:23.143 "flush": true, 00:15:23.143 "reset": true, 00:15:23.143 "compare": false, 00:15:23.143 "compare_and_write": false, 00:15:23.143 "abort": true, 00:15:23.143 "nvme_admin": false, 00:15:23.143 "nvme_io": false 00:15:23.143 }, 00:15:23.143 "memory_domains": [ 00:15:23.143 { 00:15:23.143 "dma_device_id": "system", 00:15:23.143 "dma_device_type": 1 00:15:23.143 }, 00:15:23.143 { 00:15:23.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.143 "dma_device_type": 2 00:15:23.143 } 00:15:23.143 ], 00:15:23.143 "driver_specific": {} 00:15:23.143 } 00:15:23.143 ] 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:23.143 13:43:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.143 "name": "Existed_Raid", 00:15:23.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.143 "strip_size_kb": 64, 00:15:23.143 "state": "configuring", 00:15:23.143 "raid_level": "raid0", 00:15:23.143 "superblock": false, 00:15:23.143 "num_base_bdevs": 4, 00:15:23.143 "num_base_bdevs_discovered": 3, 00:15:23.143 "num_base_bdevs_operational": 4, 00:15:23.143 "base_bdevs_list": [ 00:15:23.143 { 00:15:23.143 "name": "BaseBdev1", 00:15:23.143 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:23.143 "is_configured": true, 00:15:23.143 "data_offset": 0, 00:15:23.143 "data_size": 65536 00:15:23.143 }, 00:15:23.143 { 00:15:23.143 "name": null, 00:15:23.143 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:23.143 "is_configured": false, 00:15:23.143 "data_offset": 0, 00:15:23.143 "data_size": 65536 00:15:23.143 }, 00:15:23.143 { 00:15:23.143 "name": "BaseBdev3", 00:15:23.143 "uuid": 
"2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:23.143 "is_configured": true, 00:15:23.143 "data_offset": 0, 00:15:23.143 "data_size": 65536 00:15:23.143 }, 00:15:23.143 { 00:15:23.143 "name": "BaseBdev4", 00:15:23.143 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:23.143 "is_configured": true, 00:15:23.143 "data_offset": 0, 00:15:23.143 "data_size": 65536 00:15:23.143 } 00:15:23.143 ] 00:15:23.143 }' 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.143 13:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.713 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.713 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:23.975 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:23.975 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:24.236 [2024-06-10 13:43:38.541794] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.236 
13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.236 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.498 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.498 "name": "Existed_Raid", 00:15:24.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.498 "strip_size_kb": 64, 00:15:24.498 "state": "configuring", 00:15:24.498 "raid_level": "raid0", 00:15:24.498 "superblock": false, 00:15:24.498 "num_base_bdevs": 4, 00:15:24.498 "num_base_bdevs_discovered": 2, 00:15:24.498 "num_base_bdevs_operational": 4, 00:15:24.498 "base_bdevs_list": [ 00:15:24.498 { 00:15:24.498 "name": "BaseBdev1", 00:15:24.498 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:24.498 "is_configured": true, 00:15:24.498 "data_offset": 0, 00:15:24.498 "data_size": 65536 00:15:24.498 }, 00:15:24.498 { 00:15:24.498 "name": null, 00:15:24.498 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:24.498 "is_configured": false, 00:15:24.498 "data_offset": 0, 00:15:24.498 "data_size": 65536 00:15:24.498 }, 00:15:24.498 { 00:15:24.498 "name": null, 00:15:24.498 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:24.498 "is_configured": false, 00:15:24.498 "data_offset": 0, 
00:15:24.498 "data_size": 65536 00:15:24.498 }, 00:15:24.498 { 00:15:24.498 "name": "BaseBdev4", 00:15:24.498 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:24.498 "is_configured": true, 00:15:24.498 "data_offset": 0, 00:15:24.498 "data_size": 65536 00:15:24.498 } 00:15:24.498 ] 00:15:24.498 }' 00:15:24.498 13:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.498 13:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.085 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.085 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:25.086 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:25.086 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:25.361 [2024-06-10 13:43:39.636586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.361 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.621 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.621 "name": "Existed_Raid", 00:15:25.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.621 "strip_size_kb": 64, 00:15:25.621 "state": "configuring", 00:15:25.621 "raid_level": "raid0", 00:15:25.621 "superblock": false, 00:15:25.621 "num_base_bdevs": 4, 00:15:25.621 "num_base_bdevs_discovered": 3, 00:15:25.621 "num_base_bdevs_operational": 4, 00:15:25.621 "base_bdevs_list": [ 00:15:25.621 { 00:15:25.621 "name": "BaseBdev1", 00:15:25.621 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:25.621 "is_configured": true, 00:15:25.621 "data_offset": 0, 00:15:25.621 "data_size": 65536 00:15:25.621 }, 00:15:25.621 { 00:15:25.621 "name": null, 00:15:25.621 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:25.621 "is_configured": false, 00:15:25.621 "data_offset": 0, 00:15:25.621 "data_size": 65536 00:15:25.621 }, 00:15:25.621 { 00:15:25.621 "name": "BaseBdev3", 00:15:25.621 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:25.621 "is_configured": true, 00:15:25.621 "data_offset": 0, 00:15:25.621 "data_size": 65536 00:15:25.621 }, 00:15:25.621 { 00:15:25.621 
"name": "BaseBdev4", 00:15:25.621 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:25.621 "is_configured": true, 00:15:25.621 "data_offset": 0, 00:15:25.621 "data_size": 65536 00:15:25.621 } 00:15:25.621 ] 00:15:25.621 }' 00:15:25.621 13:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.621 13:43:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:26.192 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.192 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:26.192 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:26.192 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:26.452 [2024-06-10 13:43:40.819723] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.452 13:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.714 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.714 "name": "Existed_Raid", 00:15:26.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:26.714 "strip_size_kb": 64, 00:15:26.714 "state": "configuring", 00:15:26.714 "raid_level": "raid0", 00:15:26.714 "superblock": false, 00:15:26.714 "num_base_bdevs": 4, 00:15:26.714 "num_base_bdevs_discovered": 2, 00:15:26.714 "num_base_bdevs_operational": 4, 00:15:26.714 "base_bdevs_list": [ 00:15:26.714 { 00:15:26.714 "name": null, 00:15:26.714 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:26.714 "is_configured": false, 00:15:26.714 "data_offset": 0, 00:15:26.714 "data_size": 65536 00:15:26.714 }, 00:15:26.714 { 00:15:26.714 "name": null, 00:15:26.714 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:26.714 "is_configured": false, 00:15:26.714 "data_offset": 0, 00:15:26.714 "data_size": 65536 00:15:26.714 }, 00:15:26.714 { 00:15:26.714 "name": "BaseBdev3", 00:15:26.714 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:26.714 "is_configured": true, 00:15:26.714 "data_offset": 0, 00:15:26.714 "data_size": 65536 00:15:26.714 }, 00:15:26.714 { 00:15:26.714 "name": "BaseBdev4", 00:15:26.714 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:26.714 "is_configured": true, 
00:15:26.714 "data_offset": 0, 00:15:26.714 "data_size": 65536 00:15:26.714 } 00:15:26.714 ] 00:15:26.714 }' 00:15:26.714 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.714 13:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.285 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.285 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:27.545 [2024-06-10 13:43:41.968539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.545 13:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.815 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.815 "name": "Existed_Raid", 00:15:27.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.815 "strip_size_kb": 64, 00:15:27.815 "state": "configuring", 00:15:27.815 "raid_level": "raid0", 00:15:27.815 "superblock": false, 00:15:27.815 "num_base_bdevs": 4, 00:15:27.815 "num_base_bdevs_discovered": 3, 00:15:27.815 "num_base_bdevs_operational": 4, 00:15:27.815 "base_bdevs_list": [ 00:15:27.815 { 00:15:27.815 "name": null, 00:15:27.816 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:27.816 "is_configured": false, 00:15:27.816 "data_offset": 0, 00:15:27.816 "data_size": 65536 00:15:27.816 }, 00:15:27.816 { 00:15:27.816 "name": "BaseBdev2", 00:15:27.816 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:27.816 "is_configured": true, 00:15:27.816 "data_offset": 0, 00:15:27.816 "data_size": 65536 00:15:27.816 }, 00:15:27.816 { 00:15:27.816 "name": "BaseBdev3", 00:15:27.816 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:27.816 "is_configured": true, 00:15:27.816 "data_offset": 0, 00:15:27.816 "data_size": 65536 00:15:27.816 }, 00:15:27.816 { 00:15:27.816 "name": "BaseBdev4", 00:15:27.816 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:27.816 "is_configured": true, 00:15:27.816 "data_offset": 0, 00:15:27.816 "data_size": 65536 00:15:27.816 } 
00:15:27.816 ] 00:15:27.816 }' 00:15:27.816 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.816 13:43:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.391 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.391 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:28.652 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:28.652 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.652 13:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:28.652 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 65b80383-238e-40c8-8542-3eaf5b018b8c 00:15:28.912 [2024-06-10 13:43:43.300997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:28.912 [2024-06-10 13:43:43.301022] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23fe030 00:15:28.912 [2024-06-10 13:43:43.301027] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:28.912 [2024-06-10 13:43:43.301193] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2403ad0 00:15:28.912 [2024-06-10 13:43:43.301292] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23fe030 00:15:28.912 [2024-06-10 13:43:43.301298] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x23fe030 00:15:28.913 [2024-06-10 13:43:43.301422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:28.913 NewBaseBdev 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:28.913 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.173 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:29.433 [ 00:15:29.433 { 00:15:29.433 "name": "NewBaseBdev", 00:15:29.433 "aliases": [ 00:15:29.433 "65b80383-238e-40c8-8542-3eaf5b018b8c" 00:15:29.433 ], 00:15:29.433 "product_name": "Malloc disk", 00:15:29.433 "block_size": 512, 00:15:29.433 "num_blocks": 65536, 00:15:29.433 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:29.433 "assigned_rate_limits": { 00:15:29.433 "rw_ios_per_sec": 0, 00:15:29.433 "rw_mbytes_per_sec": 0, 00:15:29.433 "r_mbytes_per_sec": 0, 00:15:29.433 "w_mbytes_per_sec": 0 00:15:29.433 }, 00:15:29.433 "claimed": true, 00:15:29.433 "claim_type": "exclusive_write", 00:15:29.433 "zoned": false, 00:15:29.433 "supported_io_types": { 00:15:29.433 "read": true, 00:15:29.433 "write": true, 
00:15:29.433 "unmap": true, 00:15:29.433 "write_zeroes": true, 00:15:29.433 "flush": true, 00:15:29.433 "reset": true, 00:15:29.433 "compare": false, 00:15:29.433 "compare_and_write": false, 00:15:29.433 "abort": true, 00:15:29.433 "nvme_admin": false, 00:15:29.433 "nvme_io": false 00:15:29.433 }, 00:15:29.433 "memory_domains": [ 00:15:29.433 { 00:15:29.433 "dma_device_id": "system", 00:15:29.433 "dma_device_type": 1 00:15:29.433 }, 00:15:29.433 { 00:15:29.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.433 "dma_device_type": 2 00:15:29.433 } 00:15:29.433 ], 00:15:29.433 "driver_specific": {} 00:15:29.433 } 00:15:29.433 ] 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.433 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.694 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.694 "name": "Existed_Raid", 00:15:29.694 "uuid": "306fefd6-d27a-4d84-b83c-d1b00747c117", 00:15:29.694 "strip_size_kb": 64, 00:15:29.694 "state": "online", 00:15:29.694 "raid_level": "raid0", 00:15:29.694 "superblock": false, 00:15:29.694 "num_base_bdevs": 4, 00:15:29.694 "num_base_bdevs_discovered": 4, 00:15:29.694 "num_base_bdevs_operational": 4, 00:15:29.694 "base_bdevs_list": [ 00:15:29.694 { 00:15:29.694 "name": "NewBaseBdev", 00:15:29.694 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:29.694 "is_configured": true, 00:15:29.694 "data_offset": 0, 00:15:29.694 "data_size": 65536 00:15:29.694 }, 00:15:29.694 { 00:15:29.694 "name": "BaseBdev2", 00:15:29.694 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:29.694 "is_configured": true, 00:15:29.694 "data_offset": 0, 00:15:29.694 "data_size": 65536 00:15:29.694 }, 00:15:29.694 { 00:15:29.694 "name": "BaseBdev3", 00:15:29.694 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:29.694 "is_configured": true, 00:15:29.694 "data_offset": 0, 00:15:29.694 "data_size": 65536 00:15:29.694 }, 00:15:29.694 { 00:15:29.694 "name": "BaseBdev4", 00:15:29.694 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:29.694 "is_configured": true, 00:15:29.694 "data_offset": 0, 00:15:29.694 "data_size": 65536 00:15:29.694 } 00:15:29.694 ] 00:15:29.694 }' 00:15:29.694 13:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.694 13:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties 
Existed_Raid 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:30.265 [2024-06-10 13:43:44.668840] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:30.265 "name": "Existed_Raid", 00:15:30.265 "aliases": [ 00:15:30.265 "306fefd6-d27a-4d84-b83c-d1b00747c117" 00:15:30.265 ], 00:15:30.265 "product_name": "Raid Volume", 00:15:30.265 "block_size": 512, 00:15:30.265 "num_blocks": 262144, 00:15:30.265 "uuid": "306fefd6-d27a-4d84-b83c-d1b00747c117", 00:15:30.265 "assigned_rate_limits": { 00:15:30.265 "rw_ios_per_sec": 0, 00:15:30.265 "rw_mbytes_per_sec": 0, 00:15:30.265 "r_mbytes_per_sec": 0, 00:15:30.265 "w_mbytes_per_sec": 0 00:15:30.265 }, 00:15:30.265 "claimed": false, 00:15:30.265 "zoned": false, 00:15:30.265 "supported_io_types": { 00:15:30.265 "read": true, 00:15:30.265 "write": true, 00:15:30.265 "unmap": true, 00:15:30.265 "write_zeroes": true, 00:15:30.265 "flush": true, 00:15:30.265 "reset": true, 00:15:30.265 "compare": false, 00:15:30.265 "compare_and_write": false, 00:15:30.265 "abort": false, 00:15:30.265 "nvme_admin": false, 
00:15:30.265 "nvme_io": false 00:15:30.265 }, 00:15:30.265 "memory_domains": [ 00:15:30.265 { 00:15:30.265 "dma_device_id": "system", 00:15:30.265 "dma_device_type": 1 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.265 "dma_device_type": 2 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "system", 00:15:30.265 "dma_device_type": 1 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.265 "dma_device_type": 2 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "system", 00:15:30.265 "dma_device_type": 1 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.265 "dma_device_type": 2 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "system", 00:15:30.265 "dma_device_type": 1 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.265 "dma_device_type": 2 00:15:30.265 } 00:15:30.265 ], 00:15:30.265 "driver_specific": { 00:15:30.265 "raid": { 00:15:30.265 "uuid": "306fefd6-d27a-4d84-b83c-d1b00747c117", 00:15:30.265 "strip_size_kb": 64, 00:15:30.265 "state": "online", 00:15:30.265 "raid_level": "raid0", 00:15:30.265 "superblock": false, 00:15:30.265 "num_base_bdevs": 4, 00:15:30.265 "num_base_bdevs_discovered": 4, 00:15:30.265 "num_base_bdevs_operational": 4, 00:15:30.265 "base_bdevs_list": [ 00:15:30.265 { 00:15:30.265 "name": "NewBaseBdev", 00:15:30.265 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:30.265 "is_configured": true, 00:15:30.265 "data_offset": 0, 00:15:30.265 "data_size": 65536 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "name": "BaseBdev2", 00:15:30.265 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:30.265 "is_configured": true, 00:15:30.265 "data_offset": 0, 00:15:30.265 "data_size": 65536 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "name": "BaseBdev3", 00:15:30.265 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:30.265 "is_configured": 
true, 00:15:30.265 "data_offset": 0, 00:15:30.265 "data_size": 65536 00:15:30.265 }, 00:15:30.265 { 00:15:30.265 "name": "BaseBdev4", 00:15:30.265 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:30.265 "is_configured": true, 00:15:30.265 "data_offset": 0, 00:15:30.265 "data_size": 65536 00:15:30.265 } 00:15:30.265 ] 00:15:30.265 } 00:15:30.265 } 00:15:30.265 }' 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:30.265 BaseBdev2 00:15:30.265 BaseBdev3 00:15:30.265 BaseBdev4' 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:30.265 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:30.526 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:30.526 "name": "NewBaseBdev", 00:15:30.526 "aliases": [ 00:15:30.526 "65b80383-238e-40c8-8542-3eaf5b018b8c" 00:15:30.526 ], 00:15:30.526 "product_name": "Malloc disk", 00:15:30.526 "block_size": 512, 00:15:30.526 "num_blocks": 65536, 00:15:30.526 "uuid": "65b80383-238e-40c8-8542-3eaf5b018b8c", 00:15:30.526 "assigned_rate_limits": { 00:15:30.526 "rw_ios_per_sec": 0, 00:15:30.526 "rw_mbytes_per_sec": 0, 00:15:30.526 "r_mbytes_per_sec": 0, 00:15:30.526 "w_mbytes_per_sec": 0 00:15:30.526 }, 00:15:30.526 "claimed": true, 00:15:30.526 "claim_type": "exclusive_write", 00:15:30.526 "zoned": false, 00:15:30.526 "supported_io_types": { 00:15:30.526 "read": true, 00:15:30.526 "write": true, 00:15:30.526 "unmap": true, 00:15:30.526 
"write_zeroes": true, 00:15:30.526 "flush": true, 00:15:30.526 "reset": true, 00:15:30.526 "compare": false, 00:15:30.526 "compare_and_write": false, 00:15:30.526 "abort": true, 00:15:30.526 "nvme_admin": false, 00:15:30.526 "nvme_io": false 00:15:30.526 }, 00:15:30.526 "memory_domains": [ 00:15:30.526 { 00:15:30.526 "dma_device_id": "system", 00:15:30.526 "dma_device_type": 1 00:15:30.526 }, 00:15:30.526 { 00:15:30.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.526 "dma_device_type": 2 00:15:30.526 } 00:15:30.526 ], 00:15:30.526 "driver_specific": {} 00:15:30.526 }' 00:15:30.526 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:30.526 13:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:30.786 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:30.786 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:30.787 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.047 "name": "BaseBdev2", 00:15:31.047 "aliases": [ 00:15:31.047 "d57aeb8b-8905-4216-b38e-cd3cb9f61772" 00:15:31.047 ], 00:15:31.047 "product_name": "Malloc disk", 00:15:31.047 "block_size": 512, 00:15:31.047 "num_blocks": 65536, 00:15:31.047 "uuid": "d57aeb8b-8905-4216-b38e-cd3cb9f61772", 00:15:31.047 "assigned_rate_limits": { 00:15:31.047 "rw_ios_per_sec": 0, 00:15:31.047 "rw_mbytes_per_sec": 0, 00:15:31.047 "r_mbytes_per_sec": 0, 00:15:31.047 "w_mbytes_per_sec": 0 00:15:31.047 }, 00:15:31.047 "claimed": true, 00:15:31.047 "claim_type": "exclusive_write", 00:15:31.047 "zoned": false, 00:15:31.047 "supported_io_types": { 00:15:31.047 "read": true, 00:15:31.047 "write": true, 00:15:31.047 "unmap": true, 00:15:31.047 "write_zeroes": true, 00:15:31.047 "flush": true, 00:15:31.047 "reset": true, 00:15:31.047 "compare": false, 00:15:31.047 "compare_and_write": false, 00:15:31.047 "abort": true, 00:15:31.047 "nvme_admin": false, 00:15:31.047 "nvme_io": false 00:15:31.047 }, 00:15:31.047 "memory_domains": [ 00:15:31.047 { 00:15:31.047 "dma_device_id": "system", 00:15:31.047 "dma_device_type": 1 00:15:31.047 }, 00:15:31.047 { 00:15:31.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.047 "dma_device_type": 2 00:15:31.047 } 00:15:31.047 ], 00:15:31.047 "driver_specific": {} 00:15:31.047 }' 00:15:31.047 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.307 13:43:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:31.307 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.568 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:31.568 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:31.568 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.568 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:31.568 13:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.828 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.829 "name": "BaseBdev3", 00:15:31.829 "aliases": [ 00:15:31.829 "2df923b7-9de3-4cfe-995e-714c8172ef73" 00:15:31.829 ], 00:15:31.829 "product_name": "Malloc disk", 00:15:31.829 "block_size": 512, 00:15:31.829 "num_blocks": 65536, 00:15:31.829 "uuid": "2df923b7-9de3-4cfe-995e-714c8172ef73", 00:15:31.829 "assigned_rate_limits": { 00:15:31.829 "rw_ios_per_sec": 0, 00:15:31.829 "rw_mbytes_per_sec": 0, 00:15:31.829 "r_mbytes_per_sec": 0, 00:15:31.829 "w_mbytes_per_sec": 0 00:15:31.829 }, 00:15:31.829 "claimed": true, 00:15:31.829 "claim_type": "exclusive_write", 
00:15:31.829 "zoned": false, 00:15:31.829 "supported_io_types": { 00:15:31.829 "read": true, 00:15:31.829 "write": true, 00:15:31.829 "unmap": true, 00:15:31.829 "write_zeroes": true, 00:15:31.829 "flush": true, 00:15:31.829 "reset": true, 00:15:31.829 "compare": false, 00:15:31.829 "compare_and_write": false, 00:15:31.829 "abort": true, 00:15:31.829 "nvme_admin": false, 00:15:31.829 "nvme_io": false 00:15:31.829 }, 00:15:31.829 "memory_domains": [ 00:15:31.829 { 00:15:31.829 "dma_device_id": "system", 00:15:31.829 "dma_device_type": 1 00:15:31.829 }, 00:15:31.829 { 00:15:31.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.829 "dma_device_type": 2 00:15:31.829 } 00:15:31.829 ], 00:15:31.829 "driver_specific": {} 00:15:31.829 }' 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:31.829 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.089 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.089 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.089 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.089 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.089 13:43:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.089 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:32.090 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.350 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.350 "name": "BaseBdev4", 00:15:32.350 "aliases": [ 00:15:32.350 "e477afe4-8211-4f8e-b1ef-72537b53778d" 00:15:32.350 ], 00:15:32.350 "product_name": "Malloc disk", 00:15:32.350 "block_size": 512, 00:15:32.350 "num_blocks": 65536, 00:15:32.350 "uuid": "e477afe4-8211-4f8e-b1ef-72537b53778d", 00:15:32.350 "assigned_rate_limits": { 00:15:32.350 "rw_ios_per_sec": 0, 00:15:32.351 "rw_mbytes_per_sec": 0, 00:15:32.351 "r_mbytes_per_sec": 0, 00:15:32.351 "w_mbytes_per_sec": 0 00:15:32.351 }, 00:15:32.351 "claimed": true, 00:15:32.351 "claim_type": "exclusive_write", 00:15:32.351 "zoned": false, 00:15:32.351 "supported_io_types": { 00:15:32.351 "read": true, 00:15:32.351 "write": true, 00:15:32.351 "unmap": true, 00:15:32.351 "write_zeroes": true, 00:15:32.351 "flush": true, 00:15:32.351 "reset": true, 00:15:32.351 "compare": false, 00:15:32.351 "compare_and_write": false, 00:15:32.351 "abort": true, 00:15:32.351 "nvme_admin": false, 00:15:32.351 "nvme_io": false 00:15:32.351 }, 00:15:32.351 "memory_domains": [ 00:15:32.351 { 00:15:32.351 "dma_device_id": "system", 00:15:32.351 "dma_device_type": 1 00:15:32.351 }, 00:15:32.351 { 00:15:32.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.351 "dma_device_type": 2 00:15:32.351 } 00:15:32.351 ], 00:15:32.351 "driver_specific": {} 00:15:32.351 }' 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.351 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.612 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.612 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.612 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.612 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.612 13:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:32.872 [2024-06-10 13:43:47.146890] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:32.872 [2024-06-10 13:43:47.146907] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:32.872 [2024-06-10 13:43:47.146946] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:32.872 [2024-06-10 13:43:47.146994] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:32.872 [2024-06-10 13:43:47.147000] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23fe030 name Existed_Raid, state offline 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 1563368 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1563368 ']' 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1563368 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1563368 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1563368' 00:15:32.872 killing process with pid 1563368 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1563368 00:15:32.872 [2024-06-10 13:43:47.231262] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:32.872 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1563368 00:15:32.872 [2024-06-10 13:43:47.252435] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:33.133 00:15:33.133 real 0m28.137s 00:15:33.133 user 0m52.766s 00:15:33.133 sys 0m4.062s 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.133 ************************************ 00:15:33.133 END TEST raid_state_function_test 00:15:33.133 
************************************ 00:15:33.133 13:43:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:15:33.133 13:43:47 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:33.133 13:43:47 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:33.133 13:43:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:33.133 ************************************ 00:15:33.133 START TEST raid_state_function_test_sb 00:15:33.133 ************************************ 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 true 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:33.133 13:43:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1569570 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1569570' 00:15:33.133 Process raid pid: 1569570 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1569570 /var/tmp/spdk-raid.sock 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1569570 ']' 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:33.133 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:33.133 13:43:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.133 [2024-06-10 13:43:47.519647] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:15:33.133 [2024-06-10 13:43:47.519705] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:33.393 [2024-06-10 13:43:47.609536] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:33.393 [2024-06-10 13:43:47.679662] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:15:33.393 [2024-06-10 13:43:47.724249] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:33.393 [2024-06-10 13:43:47.724270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:33.964 13:43:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:33.964 13:43:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:15:33.965 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:34.224 [2024-06-10 13:43:48.568060] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:34.224 [2024-06-10 13:43:48.568090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:34.224 [2024-06-10 13:43:48.568096] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:34.224 [2024-06-10 13:43:48.568103] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:34.224 [2024-06-10 13:43:48.568108] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:34.224 [2024-06-10 13:43:48.568114] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:34.224 
[2024-06-10 13:43:48.568118] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:34.224 [2024-06-10 13:43:48.568124] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.224 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.484 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.484 "name": "Existed_Raid", 00:15:34.484 "uuid": "c87c8e3a-568c-486a-9c49-1ce3ac09d7ff", 00:15:34.484 
"strip_size_kb": 64, 00:15:34.484 "state": "configuring", 00:15:34.484 "raid_level": "raid0", 00:15:34.484 "superblock": true, 00:15:34.484 "num_base_bdevs": 4, 00:15:34.484 "num_base_bdevs_discovered": 0, 00:15:34.484 "num_base_bdevs_operational": 4, 00:15:34.484 "base_bdevs_list": [ 00:15:34.484 { 00:15:34.484 "name": "BaseBdev1", 00:15:34.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.484 "is_configured": false, 00:15:34.484 "data_offset": 0, 00:15:34.484 "data_size": 0 00:15:34.484 }, 00:15:34.484 { 00:15:34.484 "name": "BaseBdev2", 00:15:34.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.484 "is_configured": false, 00:15:34.484 "data_offset": 0, 00:15:34.484 "data_size": 0 00:15:34.484 }, 00:15:34.484 { 00:15:34.484 "name": "BaseBdev3", 00:15:34.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.484 "is_configured": false, 00:15:34.484 "data_offset": 0, 00:15:34.484 "data_size": 0 00:15:34.484 }, 00:15:34.484 { 00:15:34.484 "name": "BaseBdev4", 00:15:34.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.484 "is_configured": false, 00:15:34.484 "data_offset": 0, 00:15:34.484 "data_size": 0 00:15:34.484 } 00:15:34.484 ] 00:15:34.484 }' 00:15:34.484 13:43:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.484 13:43:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.054 13:43:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:35.054 [2024-06-10 13:43:49.498350] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:35.054 [2024-06-10 13:43:49.498372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c9890 name Existed_Raid, state configuring 00:15:35.054 13:43:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:35.314 [2024-06-10 13:43:49.698888] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:35.314 [2024-06-10 13:43:49.698911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:35.314 [2024-06-10 13:43:49.698917] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:35.314 [2024-06-10 13:43:49.698923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:35.314 [2024-06-10 13:43:49.698928] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:35.314 [2024-06-10 13:43:49.698934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:35.314 [2024-06-10 13:43:49.698939] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:35.314 [2024-06-10 13:43:49.698945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:35.314 13:43:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:35.575 [2024-06-10 13:43:49.910342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:35.575 BaseBdev1 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:35.575 13:43:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:35.575 13:43:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:35.835 13:43:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:36.094 [ 00:15:36.094 { 00:15:36.094 "name": "BaseBdev1", 00:15:36.094 "aliases": [ 00:15:36.094 "8955d9ee-b081-4bfb-8114-469f3e6b948a" 00:15:36.094 ], 00:15:36.094 "product_name": "Malloc disk", 00:15:36.094 "block_size": 512, 00:15:36.094 "num_blocks": 65536, 00:15:36.094 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:36.094 "assigned_rate_limits": { 00:15:36.094 "rw_ios_per_sec": 0, 00:15:36.094 "rw_mbytes_per_sec": 0, 00:15:36.094 "r_mbytes_per_sec": 0, 00:15:36.094 "w_mbytes_per_sec": 0 00:15:36.094 }, 00:15:36.094 "claimed": true, 00:15:36.094 "claim_type": "exclusive_write", 00:15:36.094 "zoned": false, 00:15:36.094 "supported_io_types": { 00:15:36.094 "read": true, 00:15:36.094 "write": true, 00:15:36.094 "unmap": true, 00:15:36.094 "write_zeroes": true, 00:15:36.094 "flush": true, 00:15:36.094 "reset": true, 00:15:36.094 "compare": false, 00:15:36.094 "compare_and_write": false, 00:15:36.094 "abort": true, 00:15:36.094 "nvme_admin": false, 00:15:36.094 "nvme_io": false 00:15:36.094 }, 00:15:36.094 "memory_domains": [ 00:15:36.094 { 00:15:36.094 "dma_device_id": "system", 00:15:36.094 "dma_device_type": 1 00:15:36.094 }, 00:15:36.094 { 00:15:36.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.094 
"dma_device_type": 2 00:15:36.094 } 00:15:36.094 ], 00:15:36.094 "driver_specific": {} 00:15:36.094 } 00:15:36.094 ] 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.094 "name": "Existed_Raid", 00:15:36.094 "uuid": "cbaaca4f-7c4d-4227-a9bf-dc794e7d1591", 00:15:36.094 "strip_size_kb": 64, 
00:15:36.094 "state": "configuring", 00:15:36.094 "raid_level": "raid0", 00:15:36.094 "superblock": true, 00:15:36.094 "num_base_bdevs": 4, 00:15:36.094 "num_base_bdevs_discovered": 1, 00:15:36.094 "num_base_bdevs_operational": 4, 00:15:36.094 "base_bdevs_list": [ 00:15:36.094 { 00:15:36.094 "name": "BaseBdev1", 00:15:36.094 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:36.094 "is_configured": true, 00:15:36.094 "data_offset": 2048, 00:15:36.094 "data_size": 63488 00:15:36.094 }, 00:15:36.094 { 00:15:36.094 "name": "BaseBdev2", 00:15:36.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.094 "is_configured": false, 00:15:36.094 "data_offset": 0, 00:15:36.094 "data_size": 0 00:15:36.094 }, 00:15:36.094 { 00:15:36.094 "name": "BaseBdev3", 00:15:36.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.094 "is_configured": false, 00:15:36.094 "data_offset": 0, 00:15:36.094 "data_size": 0 00:15:36.094 }, 00:15:36.094 { 00:15:36.094 "name": "BaseBdev4", 00:15:36.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.094 "is_configured": false, 00:15:36.094 "data_offset": 0, 00:15:36.094 "data_size": 0 00:15:36.094 } 00:15:36.094 ] 00:15:36.094 }' 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.094 13:43:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:36.664 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:36.924 [2024-06-10 13:43:51.245793] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:36.924 [2024-06-10 13:43:51.245818] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c9100 name Existed_Raid, state configuring 00:15:36.924 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:37.185 [2024-06-10 13:43:51.450346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:37.185 [2024-06-10 13:43:51.451540] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:37.185 [2024-06-10 13:43:51.451569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:37.185 [2024-06-10 13:43:51.451575] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:37.185 [2024-06-10 13:43:51.451581] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:37.185 [2024-06-10 13:43:51.451586] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:37.185 [2024-06-10 13:43:51.451592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.185 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.445 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.445 "name": "Existed_Raid", 00:15:37.445 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:37.445 "strip_size_kb": 64, 00:15:37.445 "state": "configuring", 00:15:37.445 "raid_level": "raid0", 00:15:37.445 "superblock": true, 00:15:37.445 "num_base_bdevs": 4, 00:15:37.445 "num_base_bdevs_discovered": 1, 00:15:37.445 "num_base_bdevs_operational": 4, 00:15:37.445 "base_bdevs_list": [ 00:15:37.445 { 00:15:37.445 "name": "BaseBdev1", 00:15:37.445 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:37.445 "is_configured": true, 00:15:37.445 "data_offset": 2048, 00:15:37.445 "data_size": 63488 00:15:37.445 }, 00:15:37.445 { 00:15:37.445 "name": "BaseBdev2", 00:15:37.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.445 "is_configured": false, 00:15:37.445 "data_offset": 0, 00:15:37.445 "data_size": 0 00:15:37.445 }, 00:15:37.445 { 00:15:37.445 "name": "BaseBdev3", 00:15:37.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.445 "is_configured": false, 00:15:37.445 "data_offset": 0, 00:15:37.445 
"data_size": 0 00:15:37.445 }, 00:15:37.445 { 00:15:37.445 "name": "BaseBdev4", 00:15:37.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.445 "is_configured": false, 00:15:37.445 "data_offset": 0, 00:15:37.445 "data_size": 0 00:15:37.445 } 00:15:37.445 ] 00:15:37.445 }' 00:15:37.445 13:43:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.445 13:43:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:38.014 [2024-06-10 13:43:52.413895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.014 BaseBdev2 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:38.014 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:38.274 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:38.533 [ 
00:15:38.533 { 00:15:38.533 "name": "BaseBdev2", 00:15:38.533 "aliases": [ 00:15:38.533 "21cc0390-d2d5-4287-8ca9-604d748f35f3" 00:15:38.533 ], 00:15:38.533 "product_name": "Malloc disk", 00:15:38.533 "block_size": 512, 00:15:38.533 "num_blocks": 65536, 00:15:38.533 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:38.533 "assigned_rate_limits": { 00:15:38.533 "rw_ios_per_sec": 0, 00:15:38.533 "rw_mbytes_per_sec": 0, 00:15:38.533 "r_mbytes_per_sec": 0, 00:15:38.533 "w_mbytes_per_sec": 0 00:15:38.533 }, 00:15:38.533 "claimed": true, 00:15:38.533 "claim_type": "exclusive_write", 00:15:38.533 "zoned": false, 00:15:38.533 "supported_io_types": { 00:15:38.533 "read": true, 00:15:38.533 "write": true, 00:15:38.533 "unmap": true, 00:15:38.533 "write_zeroes": true, 00:15:38.533 "flush": true, 00:15:38.533 "reset": true, 00:15:38.533 "compare": false, 00:15:38.533 "compare_and_write": false, 00:15:38.533 "abort": true, 00:15:38.533 "nvme_admin": false, 00:15:38.533 "nvme_io": false 00:15:38.533 }, 00:15:38.533 "memory_domains": [ 00:15:38.533 { 00:15:38.533 "dma_device_id": "system", 00:15:38.533 "dma_device_type": 1 00:15:38.533 }, 00:15:38.533 { 00:15:38.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.533 "dma_device_type": 2 00:15:38.533 } 00:15:38.533 ], 00:15:38.533 "driver_specific": {} 00:15:38.533 } 00:15:38.533 ] 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.533 13:43:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.793 13:43:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.793 "name": "Existed_Raid", 00:15:38.793 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:38.793 "strip_size_kb": 64, 00:15:38.793 "state": "configuring", 00:15:38.793 "raid_level": "raid0", 00:15:38.793 "superblock": true, 00:15:38.793 "num_base_bdevs": 4, 00:15:38.793 "num_base_bdevs_discovered": 2, 00:15:38.793 "num_base_bdevs_operational": 4, 00:15:38.793 "base_bdevs_list": [ 00:15:38.793 { 00:15:38.793 "name": "BaseBdev1", 00:15:38.793 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:38.793 "is_configured": true, 00:15:38.793 "data_offset": 2048, 00:15:38.793 "data_size": 63488 00:15:38.793 }, 00:15:38.793 { 00:15:38.793 "name": "BaseBdev2", 00:15:38.793 "uuid": 
"21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:38.793 "is_configured": true, 00:15:38.793 "data_offset": 2048, 00:15:38.793 "data_size": 63488 00:15:38.793 }, 00:15:38.793 { 00:15:38.793 "name": "BaseBdev3", 00:15:38.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.793 "is_configured": false, 00:15:38.793 "data_offset": 0, 00:15:38.793 "data_size": 0 00:15:38.793 }, 00:15:38.793 { 00:15:38.793 "name": "BaseBdev4", 00:15:38.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.793 "is_configured": false, 00:15:38.793 "data_offset": 0, 00:15:38.793 "data_size": 0 00:15:38.793 } 00:15:38.793 ] 00:15:38.793 }' 00:15:38.793 13:43:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.793 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:39.363 [2024-06-10 13:43:53.778470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:39.363 BaseBdev3 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:39.363 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:39.622 13:43:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:39.882 [ 00:15:39.882 { 00:15:39.882 "name": "BaseBdev3", 00:15:39.882 "aliases": [ 00:15:39.882 "555a3444-65f0-4503-80f5-07788d7efb9c" 00:15:39.882 ], 00:15:39.882 "product_name": "Malloc disk", 00:15:39.882 "block_size": 512, 00:15:39.882 "num_blocks": 65536, 00:15:39.882 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:39.882 "assigned_rate_limits": { 00:15:39.882 "rw_ios_per_sec": 0, 00:15:39.882 "rw_mbytes_per_sec": 0, 00:15:39.882 "r_mbytes_per_sec": 0, 00:15:39.882 "w_mbytes_per_sec": 0 00:15:39.882 }, 00:15:39.882 "claimed": true, 00:15:39.882 "claim_type": "exclusive_write", 00:15:39.882 "zoned": false, 00:15:39.882 "supported_io_types": { 00:15:39.882 "read": true, 00:15:39.882 "write": true, 00:15:39.882 "unmap": true, 00:15:39.882 "write_zeroes": true, 00:15:39.882 "flush": true, 00:15:39.882 "reset": true, 00:15:39.882 "compare": false, 00:15:39.882 "compare_and_write": false, 00:15:39.882 "abort": true, 00:15:39.882 "nvme_admin": false, 00:15:39.882 "nvme_io": false 00:15:39.882 }, 00:15:39.882 "memory_domains": [ 00:15:39.882 { 00:15:39.882 "dma_device_id": "system", 00:15:39.882 "dma_device_type": 1 00:15:39.882 }, 00:15:39.882 { 00:15:39.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.882 "dma_device_type": 2 00:15:39.882 } 00:15:39.882 ], 00:15:39.882 "driver_specific": {} 00:15:39.882 } 00:15:39.882 ] 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.882 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.141 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.141 "name": "Existed_Raid", 00:15:40.141 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:40.141 "strip_size_kb": 64, 00:15:40.141 "state": "configuring", 00:15:40.141 "raid_level": "raid0", 00:15:40.141 "superblock": true, 00:15:40.141 "num_base_bdevs": 4, 00:15:40.141 "num_base_bdevs_discovered": 3, 00:15:40.141 
"num_base_bdevs_operational": 4, 00:15:40.141 "base_bdevs_list": [ 00:15:40.141 { 00:15:40.142 "name": "BaseBdev1", 00:15:40.142 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:40.142 "is_configured": true, 00:15:40.142 "data_offset": 2048, 00:15:40.142 "data_size": 63488 00:15:40.142 }, 00:15:40.142 { 00:15:40.142 "name": "BaseBdev2", 00:15:40.142 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:40.142 "is_configured": true, 00:15:40.142 "data_offset": 2048, 00:15:40.142 "data_size": 63488 00:15:40.142 }, 00:15:40.142 { 00:15:40.142 "name": "BaseBdev3", 00:15:40.142 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:40.142 "is_configured": true, 00:15:40.142 "data_offset": 2048, 00:15:40.142 "data_size": 63488 00:15:40.142 }, 00:15:40.142 { 00:15:40.142 "name": "BaseBdev4", 00:15:40.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:40.142 "is_configured": false, 00:15:40.142 "data_offset": 0, 00:15:40.142 "data_size": 0 00:15:40.142 } 00:15:40.142 ] 00:15:40.142 }' 00:15:40.142 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.142 13:43:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:40.711 13:43:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:40.711 [2024-06-10 13:43:55.106968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:40.711 [2024-06-10 13:43:55.107092] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ca160 00:15:40.711 [2024-06-10 13:43:55.107101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:40.711 [2024-06-10 13:43:55.107256] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20b5f20 00:15:40.711 [2024-06-10 13:43:55.107358] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ca160 00:15:40.711 [2024-06-10 13:43:55.107364] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20ca160 00:15:40.711 [2024-06-10 13:43:55.107439] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:40.711 BaseBdev4 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:40.711 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:40.970 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:41.229 [ 00:15:41.229 { 00:15:41.229 "name": "BaseBdev4", 00:15:41.229 "aliases": [ 00:15:41.229 "96a76f30-5235-4dcb-a100-5aebf414f6e7" 00:15:41.229 ], 00:15:41.229 "product_name": "Malloc disk", 00:15:41.229 "block_size": 512, 00:15:41.229 "num_blocks": 65536, 00:15:41.229 "uuid": "96a76f30-5235-4dcb-a100-5aebf414f6e7", 00:15:41.229 "assigned_rate_limits": { 00:15:41.229 "rw_ios_per_sec": 0, 00:15:41.229 "rw_mbytes_per_sec": 0, 00:15:41.229 "r_mbytes_per_sec": 0, 00:15:41.229 
"w_mbytes_per_sec": 0 00:15:41.229 }, 00:15:41.229 "claimed": true, 00:15:41.229 "claim_type": "exclusive_write", 00:15:41.229 "zoned": false, 00:15:41.229 "supported_io_types": { 00:15:41.229 "read": true, 00:15:41.229 "write": true, 00:15:41.229 "unmap": true, 00:15:41.229 "write_zeroes": true, 00:15:41.229 "flush": true, 00:15:41.229 "reset": true, 00:15:41.229 "compare": false, 00:15:41.229 "compare_and_write": false, 00:15:41.229 "abort": true, 00:15:41.229 "nvme_admin": false, 00:15:41.229 "nvme_io": false 00:15:41.229 }, 00:15:41.229 "memory_domains": [ 00:15:41.229 { 00:15:41.229 "dma_device_id": "system", 00:15:41.229 "dma_device_type": 1 00:15:41.229 }, 00:15:41.229 { 00:15:41.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.229 "dma_device_type": 2 00:15:41.229 } 00:15:41.229 ], 00:15:41.229 "driver_specific": {} 00:15:41.229 } 00:15:41.229 ] 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.229 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:41.489 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.489 "name": "Existed_Raid", 00:15:41.489 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:41.489 "strip_size_kb": 64, 00:15:41.489 "state": "online", 00:15:41.489 "raid_level": "raid0", 00:15:41.489 "superblock": true, 00:15:41.489 "num_base_bdevs": 4, 00:15:41.489 "num_base_bdevs_discovered": 4, 00:15:41.489 "num_base_bdevs_operational": 4, 00:15:41.489 "base_bdevs_list": [ 00:15:41.489 { 00:15:41.489 "name": "BaseBdev1", 00:15:41.489 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:41.489 "is_configured": true, 00:15:41.489 "data_offset": 2048, 00:15:41.489 "data_size": 63488 00:15:41.489 }, 00:15:41.489 { 00:15:41.489 "name": "BaseBdev2", 00:15:41.489 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:41.489 "is_configured": true, 00:15:41.489 "data_offset": 2048, 00:15:41.489 "data_size": 63488 00:15:41.489 }, 00:15:41.489 { 00:15:41.489 "name": "BaseBdev3", 00:15:41.489 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:41.489 "is_configured": true, 00:15:41.489 "data_offset": 2048, 00:15:41.489 "data_size": 63488 00:15:41.489 }, 00:15:41.489 { 00:15:41.489 "name": "BaseBdev4", 00:15:41.489 "uuid": 
"96a76f30-5235-4dcb-a100-5aebf414f6e7", 00:15:41.489 "is_configured": true, 00:15:41.489 "data_offset": 2048, 00:15:41.489 "data_size": 63488 00:15:41.489 } 00:15:41.489 ] 00:15:41.489 }' 00:15:41.489 13:43:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.489 13:43:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:42.059 [2024-06-10 13:43:56.474694] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:42.059 "name": "Existed_Raid", 00:15:42.059 "aliases": [ 00:15:42.059 "eda50d02-8e92-45da-a5fe-abb8e2fe0499" 00:15:42.059 ], 00:15:42.059 "product_name": "Raid Volume", 00:15:42.059 "block_size": 512, 00:15:42.059 "num_blocks": 253952, 00:15:42.059 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:42.059 "assigned_rate_limits": { 00:15:42.059 "rw_ios_per_sec": 
0, 00:15:42.059 "rw_mbytes_per_sec": 0, 00:15:42.059 "r_mbytes_per_sec": 0, 00:15:42.059 "w_mbytes_per_sec": 0 00:15:42.059 }, 00:15:42.059 "claimed": false, 00:15:42.059 "zoned": false, 00:15:42.059 "supported_io_types": { 00:15:42.059 "read": true, 00:15:42.059 "write": true, 00:15:42.059 "unmap": true, 00:15:42.059 "write_zeroes": true, 00:15:42.059 "flush": true, 00:15:42.059 "reset": true, 00:15:42.059 "compare": false, 00:15:42.059 "compare_and_write": false, 00:15:42.059 "abort": false, 00:15:42.059 "nvme_admin": false, 00:15:42.059 "nvme_io": false 00:15:42.059 }, 00:15:42.059 "memory_domains": [ 00:15:42.059 { 00:15:42.059 "dma_device_id": "system", 00:15:42.059 "dma_device_type": 1 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.059 "dma_device_type": 2 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "system", 00:15:42.059 "dma_device_type": 1 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.059 "dma_device_type": 2 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "system", 00:15:42.059 "dma_device_type": 1 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.059 "dma_device_type": 2 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "system", 00:15:42.059 "dma_device_type": 1 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.059 "dma_device_type": 2 00:15:42.059 } 00:15:42.059 ], 00:15:42.059 "driver_specific": { 00:15:42.059 "raid": { 00:15:42.059 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:42.059 "strip_size_kb": 64, 00:15:42.059 "state": "online", 00:15:42.059 "raid_level": "raid0", 00:15:42.059 "superblock": true, 00:15:42.059 "num_base_bdevs": 4, 00:15:42.059 "num_base_bdevs_discovered": 4, 00:15:42.059 "num_base_bdevs_operational": 4, 00:15:42.059 "base_bdevs_list": [ 00:15:42.059 { 00:15:42.059 "name": "BaseBdev1", 
00:15:42.059 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:42.059 "is_configured": true, 00:15:42.059 "data_offset": 2048, 00:15:42.059 "data_size": 63488 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "name": "BaseBdev2", 00:15:42.059 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:42.059 "is_configured": true, 00:15:42.059 "data_offset": 2048, 00:15:42.059 "data_size": 63488 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "name": "BaseBdev3", 00:15:42.059 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:42.059 "is_configured": true, 00:15:42.059 "data_offset": 2048, 00:15:42.059 "data_size": 63488 00:15:42.059 }, 00:15:42.059 { 00:15:42.059 "name": "BaseBdev4", 00:15:42.059 "uuid": "96a76f30-5235-4dcb-a100-5aebf414f6e7", 00:15:42.059 "is_configured": true, 00:15:42.059 "data_offset": 2048, 00:15:42.059 "data_size": 63488 00:15:42.059 } 00:15:42.059 ] 00:15:42.059 } 00:15:42.059 } 00:15:42.059 }' 00:15:42.059 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:42.320 BaseBdev2 00:15:42.320 BaseBdev3 00:15:42.320 BaseBdev4' 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.320 "name": "BaseBdev1", 00:15:42.320 "aliases": [ 00:15:42.320 "8955d9ee-b081-4bfb-8114-469f3e6b948a" 00:15:42.320 ], 00:15:42.320 "product_name": "Malloc disk", 
00:15:42.320 "block_size": 512, 00:15:42.320 "num_blocks": 65536, 00:15:42.320 "uuid": "8955d9ee-b081-4bfb-8114-469f3e6b948a", 00:15:42.320 "assigned_rate_limits": { 00:15:42.320 "rw_ios_per_sec": 0, 00:15:42.320 "rw_mbytes_per_sec": 0, 00:15:42.320 "r_mbytes_per_sec": 0, 00:15:42.320 "w_mbytes_per_sec": 0 00:15:42.320 }, 00:15:42.320 "claimed": true, 00:15:42.320 "claim_type": "exclusive_write", 00:15:42.320 "zoned": false, 00:15:42.320 "supported_io_types": { 00:15:42.320 "read": true, 00:15:42.320 "write": true, 00:15:42.320 "unmap": true, 00:15:42.320 "write_zeroes": true, 00:15:42.320 "flush": true, 00:15:42.320 "reset": true, 00:15:42.320 "compare": false, 00:15:42.320 "compare_and_write": false, 00:15:42.320 "abort": true, 00:15:42.320 "nvme_admin": false, 00:15:42.320 "nvme_io": false 00:15:42.320 }, 00:15:42.320 "memory_domains": [ 00:15:42.320 { 00:15:42.320 "dma_device_id": "system", 00:15:42.320 "dma_device_type": 1 00:15:42.320 }, 00:15:42.320 { 00:15:42.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.320 "dma_device_type": 2 00:15:42.320 } 00:15:42.320 ], 00:15:42.320 "driver_specific": {} 00:15:42.320 }' 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.320 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.587 13:43:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:15:42.587 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.587 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.850 "name": "BaseBdev2", 00:15:42.850 "aliases": [ 00:15:42.850 "21cc0390-d2d5-4287-8ca9-604d748f35f3" 00:15:42.850 ], 00:15:42.850 "product_name": "Malloc disk", 00:15:42.850 "block_size": 512, 00:15:42.850 "num_blocks": 65536, 00:15:42.850 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:42.850 "assigned_rate_limits": { 00:15:42.850 "rw_ios_per_sec": 0, 00:15:42.850 "rw_mbytes_per_sec": 0, 00:15:42.850 "r_mbytes_per_sec": 0, 00:15:42.850 "w_mbytes_per_sec": 0 00:15:42.850 }, 00:15:42.850 "claimed": true, 00:15:42.850 "claim_type": "exclusive_write", 00:15:42.850 "zoned": false, 00:15:42.850 "supported_io_types": { 00:15:42.850 "read": true, 00:15:42.850 "write": true, 00:15:42.850 "unmap": true, 00:15:42.850 "write_zeroes": true, 00:15:42.850 "flush": true, 00:15:42.850 "reset": true, 00:15:42.850 "compare": false, 00:15:42.850 "compare_and_write": false, 00:15:42.850 "abort": true, 00:15:42.850 "nvme_admin": false, 00:15:42.850 "nvme_io": false 00:15:42.850 }, 00:15:42.850 "memory_domains": [ 00:15:42.850 { 
00:15:42.850 "dma_device_id": "system", 00:15:42.850 "dma_device_type": 1 00:15:42.850 }, 00:15:42.850 { 00:15:42.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.850 "dma_device_type": 2 00:15:42.850 } 00:15:42.850 ], 00:15:42.850 "driver_specific": {} 00:15:42.850 }' 00:15:42.850 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.111 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.372 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.372 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.372 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.372 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:43.372 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.372 13:43:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.372 "name": "BaseBdev3", 00:15:43.372 "aliases": [ 00:15:43.372 "555a3444-65f0-4503-80f5-07788d7efb9c" 00:15:43.372 ], 00:15:43.372 "product_name": "Malloc disk", 00:15:43.372 "block_size": 512, 00:15:43.372 "num_blocks": 65536, 00:15:43.372 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:43.372 "assigned_rate_limits": { 00:15:43.372 "rw_ios_per_sec": 0, 00:15:43.372 "rw_mbytes_per_sec": 0, 00:15:43.372 "r_mbytes_per_sec": 0, 00:15:43.372 "w_mbytes_per_sec": 0 00:15:43.372 }, 00:15:43.372 "claimed": true, 00:15:43.372 "claim_type": "exclusive_write", 00:15:43.372 "zoned": false, 00:15:43.372 "supported_io_types": { 00:15:43.372 "read": true, 00:15:43.372 "write": true, 00:15:43.372 "unmap": true, 00:15:43.372 "write_zeroes": true, 00:15:43.372 "flush": true, 00:15:43.372 "reset": true, 00:15:43.372 "compare": false, 00:15:43.373 "compare_and_write": false, 00:15:43.373 "abort": true, 00:15:43.373 "nvme_admin": false, 00:15:43.373 "nvme_io": false 00:15:43.373 }, 00:15:43.373 "memory_domains": [ 00:15:43.373 { 00:15:43.373 "dma_device_id": "system", 00:15:43.373 "dma_device_type": 1 00:15:43.373 }, 00:15:43.373 { 00:15:43.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.373 "dma_device_type": 2 00:15:43.373 } 00:15:43.373 ], 00:15:43.373 "driver_specific": {} 00:15:43.373 }' 00:15:43.633 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.633 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.633 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.633 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.633 13:43:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.633 13:43:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.633 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.633 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.633 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.633 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.893 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.893 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.893 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.893 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:43.893 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.154 "name": "BaseBdev4", 00:15:44.154 "aliases": [ 00:15:44.154 "96a76f30-5235-4dcb-a100-5aebf414f6e7" 00:15:44.154 ], 00:15:44.154 "product_name": "Malloc disk", 00:15:44.154 "block_size": 512, 00:15:44.154 "num_blocks": 65536, 00:15:44.154 "uuid": "96a76f30-5235-4dcb-a100-5aebf414f6e7", 00:15:44.154 "assigned_rate_limits": { 00:15:44.154 "rw_ios_per_sec": 0, 00:15:44.154 "rw_mbytes_per_sec": 0, 00:15:44.154 "r_mbytes_per_sec": 0, 00:15:44.154 "w_mbytes_per_sec": 0 00:15:44.154 }, 00:15:44.154 "claimed": true, 00:15:44.154 "claim_type": "exclusive_write", 00:15:44.154 "zoned": false, 00:15:44.154 "supported_io_types": { 00:15:44.154 "read": true, 00:15:44.154 "write": true, 00:15:44.154 "unmap": true, 00:15:44.154 "write_zeroes": true, 00:15:44.154 "flush": 
true, 00:15:44.154 "reset": true, 00:15:44.154 "compare": false, 00:15:44.154 "compare_and_write": false, 00:15:44.154 "abort": true, 00:15:44.154 "nvme_admin": false, 00:15:44.154 "nvme_io": false 00:15:44.154 }, 00:15:44.154 "memory_domains": [ 00:15:44.154 { 00:15:44.154 "dma_device_id": "system", 00:15:44.154 "dma_device_type": 1 00:15:44.154 }, 00:15:44.154 { 00:15:44.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.154 "dma_device_type": 2 00:15:44.154 } 00:15:44.154 ], 00:15:44.154 "driver_specific": {} 00:15:44.154 }' 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.154 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.414 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.414 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.414 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.414 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.414 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:15:44.675 [2024-06-10 13:43:58.928743] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:44.675 [2024-06-10 13:43:58.928763] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:44.675 [2024-06-10 13:43:58.928801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.675 13:43:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.675 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.675 "name": "Existed_Raid", 00:15:44.675 "uuid": "eda50d02-8e92-45da-a5fe-abb8e2fe0499", 00:15:44.675 "strip_size_kb": 64, 00:15:44.675 "state": "offline", 00:15:44.675 "raid_level": "raid0", 00:15:44.675 "superblock": true, 00:15:44.675 "num_base_bdevs": 4, 00:15:44.675 "num_base_bdevs_discovered": 3, 00:15:44.675 "num_base_bdevs_operational": 3, 00:15:44.675 "base_bdevs_list": [ 00:15:44.675 { 00:15:44.675 "name": null, 00:15:44.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.675 "is_configured": false, 00:15:44.675 "data_offset": 2048, 00:15:44.675 "data_size": 63488 00:15:44.675 }, 00:15:44.675 { 00:15:44.675 "name": "BaseBdev2", 00:15:44.675 "uuid": "21cc0390-d2d5-4287-8ca9-604d748f35f3", 00:15:44.675 "is_configured": true, 00:15:44.675 "data_offset": 2048, 00:15:44.675 "data_size": 63488 00:15:44.675 }, 00:15:44.675 { 00:15:44.675 "name": "BaseBdev3", 00:15:44.675 "uuid": "555a3444-65f0-4503-80f5-07788d7efb9c", 00:15:44.675 "is_configured": true, 00:15:44.675 "data_offset": 2048, 00:15:44.675 "data_size": 63488 00:15:44.675 }, 00:15:44.675 { 00:15:44.675 "name": "BaseBdev4", 00:15:44.675 "uuid": "96a76f30-5235-4dcb-a100-5aebf414f6e7", 00:15:44.675 "is_configured": true, 00:15:44.675 "data_offset": 2048, 00:15:44.675 "data_size": 63488 00:15:44.675 } 00:15:44.675 ] 00:15:44.675 }' 00:15:44.675 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.675 
13:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:45.245 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:45.245 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:45.245 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.245 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:45.506 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:45.506 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:45.506 13:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:45.767 [2024-06-10 13:44:00.103742] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:45.767 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:45.767 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:45.767 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.767 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:46.027 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:46.027 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:46.027 13:44:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:46.287 [2024-06-10 13:44:00.502980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:46.287 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:46.547 [2024-06-10 13:44:00.909965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:46.547 [2024-06-10 13:44:00.909997] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ca160 name Existed_Raid, state offline 00:15:46.547 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:46.547 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:46.547 13:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.547 13:44:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:46.807 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:47.067 BaseBdev2 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.068 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:15:47.328 [ 00:15:47.328 { 00:15:47.328 "name": "BaseBdev2", 00:15:47.328 "aliases": [ 00:15:47.328 "4b775420-4ac4-4900-bec1-05d76b31be72" 00:15:47.328 ], 00:15:47.328 "product_name": "Malloc disk", 00:15:47.328 "block_size": 512, 00:15:47.328 "num_blocks": 65536, 00:15:47.328 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:47.328 "assigned_rate_limits": { 00:15:47.328 "rw_ios_per_sec": 0, 00:15:47.328 "rw_mbytes_per_sec": 0, 00:15:47.328 "r_mbytes_per_sec": 0, 00:15:47.328 "w_mbytes_per_sec": 0 00:15:47.328 }, 00:15:47.328 "claimed": false, 00:15:47.328 "zoned": false, 00:15:47.328 "supported_io_types": { 00:15:47.328 "read": true, 00:15:47.328 "write": true, 00:15:47.328 "unmap": true, 00:15:47.328 "write_zeroes": true, 00:15:47.328 "flush": true, 00:15:47.328 "reset": true, 00:15:47.328 "compare": false, 00:15:47.328 "compare_and_write": false, 00:15:47.328 "abort": true, 00:15:47.328 "nvme_admin": false, 00:15:47.328 "nvme_io": false 00:15:47.328 }, 00:15:47.328 "memory_domains": [ 00:15:47.328 { 00:15:47.328 "dma_device_id": "system", 00:15:47.328 "dma_device_type": 1 00:15:47.328 }, 00:15:47.328 { 00:15:47.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.328 "dma_device_type": 2 00:15:47.328 } 00:15:47.328 ], 00:15:47.328 "driver_specific": {} 00:15:47.328 } 00:15:47.328 ] 00:15:47.328 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:47.328 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:47.328 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:47.328 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:47.589 BaseBdev3 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:47.589 13:44:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.850 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:48.110 [ 00:15:48.110 { 00:15:48.110 "name": "BaseBdev3", 00:15:48.110 "aliases": [ 00:15:48.110 "178b2e56-bafd-4958-9a92-e33bf77a264d" 00:15:48.110 ], 00:15:48.110 "product_name": "Malloc disk", 00:15:48.110 "block_size": 512, 00:15:48.110 "num_blocks": 65536, 00:15:48.110 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:48.110 "assigned_rate_limits": { 00:15:48.110 "rw_ios_per_sec": 0, 00:15:48.110 "rw_mbytes_per_sec": 0, 00:15:48.110 "r_mbytes_per_sec": 0, 00:15:48.110 "w_mbytes_per_sec": 0 00:15:48.110 }, 00:15:48.110 "claimed": false, 00:15:48.110 "zoned": false, 00:15:48.110 "supported_io_types": { 00:15:48.110 "read": true, 00:15:48.110 "write": true, 00:15:48.110 "unmap": true, 00:15:48.110 "write_zeroes": true, 00:15:48.110 "flush": true, 00:15:48.110 "reset": true, 00:15:48.110 "compare": false, 00:15:48.110 "compare_and_write": false, 00:15:48.110 "abort": true, 00:15:48.110 "nvme_admin": false, 00:15:48.110 "nvme_io": false 00:15:48.110 }, 00:15:48.110 
"memory_domains": [ 00:15:48.110 { 00:15:48.110 "dma_device_id": "system", 00:15:48.110 "dma_device_type": 1 00:15:48.110 }, 00:15:48.110 { 00:15:48.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.110 "dma_device_type": 2 00:15:48.110 } 00:15:48.110 ], 00:15:48.110 "driver_specific": {} 00:15:48.110 } 00:15:48.110 ] 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:48.111 BaseBdev4 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:48.111 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.371 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:15:48.631 [ 00:15:48.631 { 00:15:48.631 "name": "BaseBdev4", 00:15:48.632 "aliases": [ 00:15:48.632 "ae114e5f-5314-4da4-b065-91f78b9acc0e" 00:15:48.632 ], 00:15:48.632 "product_name": "Malloc disk", 00:15:48.632 "block_size": 512, 00:15:48.632 "num_blocks": 65536, 00:15:48.632 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:48.632 "assigned_rate_limits": { 00:15:48.632 "rw_ios_per_sec": 0, 00:15:48.632 "rw_mbytes_per_sec": 0, 00:15:48.632 "r_mbytes_per_sec": 0, 00:15:48.632 "w_mbytes_per_sec": 0 00:15:48.632 }, 00:15:48.632 "claimed": false, 00:15:48.632 "zoned": false, 00:15:48.632 "supported_io_types": { 00:15:48.632 "read": true, 00:15:48.632 "write": true, 00:15:48.632 "unmap": true, 00:15:48.632 "write_zeroes": true, 00:15:48.632 "flush": true, 00:15:48.632 "reset": true, 00:15:48.632 "compare": false, 00:15:48.632 "compare_and_write": false, 00:15:48.632 "abort": true, 00:15:48.632 "nvme_admin": false, 00:15:48.632 "nvme_io": false 00:15:48.632 }, 00:15:48.632 "memory_domains": [ 00:15:48.632 { 00:15:48.632 "dma_device_id": "system", 00:15:48.632 "dma_device_type": 1 00:15:48.632 }, 00:15:48.632 { 00:15:48.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.632 "dma_device_type": 2 00:15:48.632 } 00:15:48.632 ], 00:15:48.632 "driver_specific": {} 00:15:48.632 } 00:15:48.632 ] 00:15:48.632 13:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:48.632 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:48.632 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:48.632 13:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:48.894 [2024-06-10 13:44:03.122082] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:48.894 [2024-06-10 13:44:03.122113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:48.894 [2024-06-10 13:44:03.122126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.894 [2024-06-10 13:44:03.123239] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.894 [2024-06-10 13:44:03.123274] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.894 13:44:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.894 "name": "Existed_Raid", 00:15:48.894 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:48.894 "strip_size_kb": 64, 00:15:48.894 "state": "configuring", 00:15:48.894 "raid_level": "raid0", 00:15:48.894 "superblock": true, 00:15:48.894 "num_base_bdevs": 4, 00:15:48.894 "num_base_bdevs_discovered": 3, 00:15:48.894 "num_base_bdevs_operational": 4, 00:15:48.894 "base_bdevs_list": [ 00:15:48.894 { 00:15:48.894 "name": "BaseBdev1", 00:15:48.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.894 "is_configured": false, 00:15:48.894 "data_offset": 0, 00:15:48.894 "data_size": 0 00:15:48.894 }, 00:15:48.894 { 00:15:48.894 "name": "BaseBdev2", 00:15:48.894 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:48.894 "is_configured": true, 00:15:48.894 "data_offset": 2048, 00:15:48.894 "data_size": 63488 00:15:48.894 }, 00:15:48.894 { 00:15:48.894 "name": "BaseBdev3", 00:15:48.894 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:48.894 "is_configured": true, 00:15:48.894 "data_offset": 2048, 00:15:48.894 "data_size": 63488 00:15:48.894 }, 00:15:48.894 { 00:15:48.894 "name": "BaseBdev4", 00:15:48.894 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:48.894 "is_configured": true, 00:15:48.894 "data_offset": 2048, 00:15:48.894 "data_size": 63488 00:15:48.894 } 00:15:48.894 ] 00:15:48.894 }' 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.894 13:44:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.467 13:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:49.727 
[2024-06-10 13:44:04.076470] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.727 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.988 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.988 "name": "Existed_Raid", 00:15:49.988 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:49.988 "strip_size_kb": 64, 00:15:49.988 "state": "configuring", 00:15:49.988 "raid_level": "raid0", 00:15:49.988 "superblock": true, 00:15:49.988 "num_base_bdevs": 
4, 00:15:49.988 "num_base_bdevs_discovered": 2, 00:15:49.988 "num_base_bdevs_operational": 4, 00:15:49.988 "base_bdevs_list": [ 00:15:49.988 { 00:15:49.988 "name": "BaseBdev1", 00:15:49.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.988 "is_configured": false, 00:15:49.988 "data_offset": 0, 00:15:49.988 "data_size": 0 00:15:49.988 }, 00:15:49.988 { 00:15:49.988 "name": null, 00:15:49.988 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:49.988 "is_configured": false, 00:15:49.988 "data_offset": 2048, 00:15:49.988 "data_size": 63488 00:15:49.988 }, 00:15:49.988 { 00:15:49.988 "name": "BaseBdev3", 00:15:49.988 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:49.988 "is_configured": true, 00:15:49.988 "data_offset": 2048, 00:15:49.988 "data_size": 63488 00:15:49.988 }, 00:15:49.988 { 00:15:49.988 "name": "BaseBdev4", 00:15:49.988 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:49.988 "is_configured": true, 00:15:49.988 "data_offset": 2048, 00:15:49.988 "data_size": 63488 00:15:49.988 } 00:15:49.988 ] 00:15:49.988 }' 00:15:49.988 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.988 13:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:50.559 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.559 13:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:50.819 [2024-06-10 13:44:05.260594] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:50.819 BaseBdev1 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:50.819 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.079 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:51.339 [ 00:15:51.339 { 00:15:51.339 "name": "BaseBdev1", 00:15:51.340 "aliases": [ 00:15:51.340 "54c9005b-840d-44fa-9193-11ecdd042dec" 00:15:51.340 ], 00:15:51.340 "product_name": "Malloc disk", 00:15:51.340 "block_size": 512, 00:15:51.340 "num_blocks": 65536, 00:15:51.340 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:51.340 "assigned_rate_limits": { 00:15:51.340 "rw_ios_per_sec": 0, 00:15:51.340 "rw_mbytes_per_sec": 0, 00:15:51.340 "r_mbytes_per_sec": 0, 00:15:51.340 "w_mbytes_per_sec": 0 00:15:51.340 }, 00:15:51.340 "claimed": true, 00:15:51.340 "claim_type": "exclusive_write", 00:15:51.340 "zoned": false, 00:15:51.340 "supported_io_types": { 00:15:51.340 "read": true, 00:15:51.340 "write": true, 00:15:51.340 "unmap": true, 00:15:51.340 
"write_zeroes": true, 00:15:51.340 "flush": true, 00:15:51.340 "reset": true, 00:15:51.340 "compare": false, 00:15:51.340 "compare_and_write": false, 00:15:51.340 "abort": true, 00:15:51.340 "nvme_admin": false, 00:15:51.340 "nvme_io": false 00:15:51.340 }, 00:15:51.340 "memory_domains": [ 00:15:51.340 { 00:15:51.340 "dma_device_id": "system", 00:15:51.340 "dma_device_type": 1 00:15:51.340 }, 00:15:51.340 { 00:15:51.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.340 "dma_device_type": 2 00:15:51.340 } 00:15:51.340 ], 00:15:51.340 "driver_specific": {} 00:15:51.340 } 00:15:51.340 ] 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.340 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.600 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.600 "name": "Existed_Raid", 00:15:51.600 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:51.600 "strip_size_kb": 64, 00:15:51.600 "state": "configuring", 00:15:51.600 "raid_level": "raid0", 00:15:51.600 "superblock": true, 00:15:51.600 "num_base_bdevs": 4, 00:15:51.600 "num_base_bdevs_discovered": 3, 00:15:51.600 "num_base_bdevs_operational": 4, 00:15:51.600 "base_bdevs_list": [ 00:15:51.600 { 00:15:51.600 "name": "BaseBdev1", 00:15:51.600 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:51.600 "is_configured": true, 00:15:51.600 "data_offset": 2048, 00:15:51.600 "data_size": 63488 00:15:51.600 }, 00:15:51.600 { 00:15:51.600 "name": null, 00:15:51.600 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:51.600 "is_configured": false, 00:15:51.600 "data_offset": 2048, 00:15:51.600 "data_size": 63488 00:15:51.600 }, 00:15:51.600 { 00:15:51.600 "name": "BaseBdev3", 00:15:51.600 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:51.600 "is_configured": true, 00:15:51.600 "data_offset": 2048, 00:15:51.600 "data_size": 63488 00:15:51.600 }, 00:15:51.600 { 00:15:51.600 "name": "BaseBdev4", 00:15:51.600 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:51.600 "is_configured": true, 00:15:51.600 "data_offset": 2048, 00:15:51.600 "data_size": 63488 00:15:51.600 } 00:15:51.600 ] 00:15:51.600 }' 00:15:51.600 13:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.600 13:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.171 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.171 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:52.171 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:52.171 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:52.431 [2024-06-10 13:44:06.812583] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.431 13:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.692 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.692 "name": "Existed_Raid", 00:15:52.692 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:52.692 "strip_size_kb": 64, 00:15:52.692 "state": "configuring", 00:15:52.692 "raid_level": "raid0", 00:15:52.692 "superblock": true, 00:15:52.692 "num_base_bdevs": 4, 00:15:52.692 "num_base_bdevs_discovered": 2, 00:15:52.692 "num_base_bdevs_operational": 4, 00:15:52.692 "base_bdevs_list": [ 00:15:52.692 { 00:15:52.692 "name": "BaseBdev1", 00:15:52.692 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:52.692 "is_configured": true, 00:15:52.692 "data_offset": 2048, 00:15:52.692 "data_size": 63488 00:15:52.692 }, 00:15:52.692 { 00:15:52.692 "name": null, 00:15:52.692 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:52.692 "is_configured": false, 00:15:52.692 "data_offset": 2048, 00:15:52.692 "data_size": 63488 00:15:52.692 }, 00:15:52.692 { 00:15:52.692 "name": null, 00:15:52.692 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:52.692 "is_configured": false, 00:15:52.692 "data_offset": 2048, 00:15:52.692 "data_size": 63488 00:15:52.692 }, 00:15:52.692 { 00:15:52.692 "name": "BaseBdev4", 00:15:52.692 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:52.692 "is_configured": true, 00:15:52.693 "data_offset": 2048, 00:15:52.693 "data_size": 63488 00:15:52.693 } 00:15:52.693 ] 00:15:52.693 }' 00:15:52.693 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.693 13:44:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:53.263 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.263 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:53.558 [2024-06-10 13:44:07.971555] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.558 13:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.839 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.839 "name": "Existed_Raid", 00:15:53.839 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:53.839 "strip_size_kb": 64, 00:15:53.839 "state": "configuring", 00:15:53.839 "raid_level": "raid0", 00:15:53.839 "superblock": true, 00:15:53.839 "num_base_bdevs": 4, 00:15:53.839 "num_base_bdevs_discovered": 3, 00:15:53.839 "num_base_bdevs_operational": 4, 00:15:53.839 "base_bdevs_list": [ 00:15:53.839 { 00:15:53.839 "name": "BaseBdev1", 00:15:53.839 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:53.839 "is_configured": true, 00:15:53.839 "data_offset": 2048, 00:15:53.839 "data_size": 63488 00:15:53.839 }, 00:15:53.839 { 00:15:53.839 "name": null, 00:15:53.839 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:53.839 "is_configured": false, 00:15:53.839 "data_offset": 2048, 00:15:53.839 "data_size": 63488 00:15:53.839 }, 00:15:53.839 { 00:15:53.839 "name": "BaseBdev3", 00:15:53.839 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:53.839 "is_configured": true, 00:15:53.839 "data_offset": 2048, 00:15:53.839 "data_size": 63488 00:15:53.839 }, 00:15:53.839 { 00:15:53.839 "name": "BaseBdev4", 00:15:53.839 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:53.839 "is_configured": true, 00:15:53.839 "data_offset": 2048, 00:15:53.839 "data_size": 63488 00:15:53.839 } 00:15:53.839 ] 00:15:53.839 }' 00:15:53.839 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.839 13:44:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.411 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.411 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:54.671 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:54.671 13:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:54.931 [2024-06-10 13:44:09.154587] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:54.931 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.931 "name": "Existed_Raid", 00:15:54.931 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:54.931 "strip_size_kb": 64, 00:15:54.931 "state": "configuring", 00:15:54.931 "raid_level": "raid0", 00:15:54.931 "superblock": true, 00:15:54.931 "num_base_bdevs": 4, 00:15:54.931 "num_base_bdevs_discovered": 2, 00:15:54.931 "num_base_bdevs_operational": 4, 00:15:54.931 "base_bdevs_list": [ 00:15:54.931 { 00:15:54.931 "name": null, 00:15:54.931 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:54.931 "is_configured": false, 00:15:54.931 "data_offset": 2048, 00:15:54.931 "data_size": 63488 00:15:54.931 }, 00:15:54.931 { 00:15:54.931 "name": null, 00:15:54.931 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:54.931 "is_configured": false, 00:15:54.931 "data_offset": 2048, 00:15:54.931 "data_size": 63488 00:15:54.931 }, 00:15:54.931 { 00:15:54.931 "name": "BaseBdev3", 00:15:54.931 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:54.931 "is_configured": true, 00:15:54.932 "data_offset": 2048, 00:15:54.932 "data_size": 63488 00:15:54.932 }, 00:15:54.932 { 00:15:54.932 "name": "BaseBdev4", 00:15:54.932 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:54.932 "is_configured": true, 00:15:54.932 "data_offset": 2048, 00:15:54.932 "data_size": 63488 00:15:54.932 } 00:15:54.932 ] 00:15:54.932 }' 00:15:54.932 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.932 13:44:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:55.502 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:15:55.502 13:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.762 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:55.762 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:56.022 [2024-06-10 13:44:10.287584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.023 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:56.283 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.283 "name": "Existed_Raid", 00:15:56.283 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:56.283 "strip_size_kb": 64, 00:15:56.283 "state": "configuring", 00:15:56.283 "raid_level": "raid0", 00:15:56.283 "superblock": true, 00:15:56.283 "num_base_bdevs": 4, 00:15:56.283 "num_base_bdevs_discovered": 3, 00:15:56.283 "num_base_bdevs_operational": 4, 00:15:56.283 "base_bdevs_list": [ 00:15:56.283 { 00:15:56.283 "name": null, 00:15:56.283 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:56.283 "is_configured": false, 00:15:56.283 "data_offset": 2048, 00:15:56.283 "data_size": 63488 00:15:56.283 }, 00:15:56.283 { 00:15:56.283 "name": "BaseBdev2", 00:15:56.283 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:56.283 "is_configured": true, 00:15:56.283 "data_offset": 2048, 00:15:56.283 "data_size": 63488 00:15:56.283 }, 00:15:56.283 { 00:15:56.283 "name": "BaseBdev3", 00:15:56.283 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:56.283 "is_configured": true, 00:15:56.283 "data_offset": 2048, 00:15:56.283 "data_size": 63488 00:15:56.283 }, 00:15:56.283 { 00:15:56.283 "name": "BaseBdev4", 00:15:56.283 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:56.283 "is_configured": true, 00:15:56.283 "data_offset": 2048, 00:15:56.283 "data_size": 63488 00:15:56.283 } 00:15:56.283 ] 00:15:56.283 }' 00:15:56.283 13:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.283 13:44:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.852 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.852 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:56.852 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:56.852 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.852 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:57.112 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 54c9005b-840d-44fa-9193-11ecdd042dec 00:15:57.373 [2024-06-10 13:44:11.652226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:57.373 [2024-06-10 13:44:11.652345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ce8a0 00:15:57.373 [2024-06-10 13:44:11.652353] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:57.373 [2024-06-10 13:44:11.652504] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c9840 00:15:57.373 [2024-06-10 13:44:11.652599] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ce8a0 00:15:57.373 [2024-06-10 13:44:11.652605] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20ce8a0 00:15:57.373 [2024-06-10 13:44:11.652675] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.373 NewBaseBdev 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:57.373 13:44:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:57.373 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.634 13:44:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:57.634 [ 00:15:57.634 { 00:15:57.634 "name": "NewBaseBdev", 00:15:57.634 "aliases": [ 00:15:57.634 "54c9005b-840d-44fa-9193-11ecdd042dec" 00:15:57.634 ], 00:15:57.634 "product_name": "Malloc disk", 00:15:57.634 "block_size": 512, 00:15:57.634 "num_blocks": 65536, 00:15:57.634 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:57.634 "assigned_rate_limits": { 00:15:57.634 "rw_ios_per_sec": 0, 00:15:57.634 "rw_mbytes_per_sec": 0, 00:15:57.634 "r_mbytes_per_sec": 0, 00:15:57.634 "w_mbytes_per_sec": 0 00:15:57.634 }, 00:15:57.634 "claimed": true, 00:15:57.634 "claim_type": "exclusive_write", 00:15:57.634 "zoned": false, 00:15:57.634 "supported_io_types": { 00:15:57.634 "read": true, 00:15:57.634 "write": true, 00:15:57.634 "unmap": true, 00:15:57.634 "write_zeroes": true, 00:15:57.634 "flush": true, 00:15:57.634 "reset": true, 00:15:57.634 "compare": false, 00:15:57.634 "compare_and_write": false, 00:15:57.634 "abort": true, 00:15:57.634 "nvme_admin": false, 00:15:57.634 "nvme_io": false 
00:15:57.634 }, 00:15:57.634 "memory_domains": [ 00:15:57.634 { 00:15:57.634 "dma_device_id": "system", 00:15:57.634 "dma_device_type": 1 00:15:57.634 }, 00:15:57.634 { 00:15:57.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.634 "dma_device_type": 2 00:15:57.634 } 00:15:57.634 ], 00:15:57.634 "driver_specific": {} 00:15:57.634 } 00:15:57.634 ] 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.634 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.894 
13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.894 "name": "Existed_Raid", 00:15:57.894 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:57.894 "strip_size_kb": 64, 00:15:57.894 "state": "online", 00:15:57.894 "raid_level": "raid0", 00:15:57.894 "superblock": true, 00:15:57.894 "num_base_bdevs": 4, 00:15:57.894 "num_base_bdevs_discovered": 4, 00:15:57.894 "num_base_bdevs_operational": 4, 00:15:57.894 "base_bdevs_list": [ 00:15:57.894 { 00:15:57.894 "name": "NewBaseBdev", 00:15:57.894 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:57.894 "is_configured": true, 00:15:57.894 "data_offset": 2048, 00:15:57.894 "data_size": 63488 00:15:57.894 }, 00:15:57.894 { 00:15:57.894 "name": "BaseBdev2", 00:15:57.894 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:57.894 "is_configured": true, 00:15:57.894 "data_offset": 2048, 00:15:57.894 "data_size": 63488 00:15:57.894 }, 00:15:57.894 { 00:15:57.894 "name": "BaseBdev3", 00:15:57.894 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:57.894 "is_configured": true, 00:15:57.894 "data_offset": 2048, 00:15:57.894 "data_size": 63488 00:15:57.894 }, 00:15:57.894 { 00:15:57.894 "name": "BaseBdev4", 00:15:57.894 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:15:57.894 "is_configured": true, 00:15:57.894 "data_offset": 2048, 00:15:57.894 "data_size": 63488 00:15:57.894 } 00:15:57.894 ] 00:15:57.894 }' 00:15:57.894 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.894 13:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:58.465 13:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:58.726 [2024-06-10 13:44:13.003908] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:58.726 "name": "Existed_Raid", 00:15:58.726 "aliases": [ 00:15:58.726 "c262e48d-d08a-40c6-8fd4-e014b5c055f9" 00:15:58.726 ], 00:15:58.726 "product_name": "Raid Volume", 00:15:58.726 "block_size": 512, 00:15:58.726 "num_blocks": 253952, 00:15:58.726 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:58.726 "assigned_rate_limits": { 00:15:58.726 "rw_ios_per_sec": 0, 00:15:58.726 "rw_mbytes_per_sec": 0, 00:15:58.726 "r_mbytes_per_sec": 0, 00:15:58.726 "w_mbytes_per_sec": 0 00:15:58.726 }, 00:15:58.726 "claimed": false, 00:15:58.726 "zoned": false, 00:15:58.726 "supported_io_types": { 00:15:58.726 "read": true, 00:15:58.726 "write": true, 00:15:58.726 "unmap": true, 00:15:58.726 "write_zeroes": true, 00:15:58.726 "flush": true, 00:15:58.726 "reset": true, 00:15:58.726 "compare": false, 00:15:58.726 "compare_and_write": false, 00:15:58.726 "abort": false, 00:15:58.726 "nvme_admin": false, 00:15:58.726 "nvme_io": false 00:15:58.726 }, 00:15:58.726 "memory_domains": [ 00:15:58.726 { 00:15:58.726 "dma_device_id": "system", 00:15:58.726 "dma_device_type": 1 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.726 "dma_device_type": 2 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "system", 00:15:58.726 "dma_device_type": 1 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.726 "dma_device_type": 2 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "system", 00:15:58.726 "dma_device_type": 1 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.726 "dma_device_type": 2 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "system", 00:15:58.726 "dma_device_type": 1 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.726 "dma_device_type": 2 00:15:58.726 } 00:15:58.726 ], 00:15:58.726 "driver_specific": { 00:15:58.726 "raid": { 00:15:58.726 "uuid": "c262e48d-d08a-40c6-8fd4-e014b5c055f9", 00:15:58.726 "strip_size_kb": 64, 00:15:58.726 "state": "online", 00:15:58.726 "raid_level": "raid0", 00:15:58.726 "superblock": true, 00:15:58.726 "num_base_bdevs": 4, 00:15:58.726 "num_base_bdevs_discovered": 4, 00:15:58.726 "num_base_bdevs_operational": 4, 00:15:58.726 "base_bdevs_list": [ 00:15:58.726 { 00:15:58.726 "name": "NewBaseBdev", 00:15:58.726 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:58.726 "is_configured": true, 00:15:58.726 "data_offset": 2048, 00:15:58.726 "data_size": 63488 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "name": "BaseBdev2", 00:15:58.726 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:58.726 "is_configured": true, 00:15:58.726 "data_offset": 2048, 00:15:58.726 "data_size": 63488 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "name": "BaseBdev3", 00:15:58.726 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:15:58.726 "is_configured": true, 00:15:58.726 "data_offset": 2048, 00:15:58.726 "data_size": 63488 00:15:58.726 }, 00:15:58.726 { 00:15:58.726 "name": "BaseBdev4", 00:15:58.726 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 
00:15:58.726 "is_configured": true, 00:15:58.726 "data_offset": 2048, 00:15:58.726 "data_size": 63488 00:15:58.726 } 00:15:58.726 ] 00:15:58.726 } 00:15:58.726 } 00:15:58.726 }' 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:58.726 BaseBdev2 00:15:58.726 BaseBdev3 00:15:58.726 BaseBdev4' 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:58.726 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:58.987 "name": "NewBaseBdev", 00:15:58.987 "aliases": [ 00:15:58.987 "54c9005b-840d-44fa-9193-11ecdd042dec" 00:15:58.987 ], 00:15:58.987 "product_name": "Malloc disk", 00:15:58.987 "block_size": 512, 00:15:58.987 "num_blocks": 65536, 00:15:58.987 "uuid": "54c9005b-840d-44fa-9193-11ecdd042dec", 00:15:58.987 "assigned_rate_limits": { 00:15:58.987 "rw_ios_per_sec": 0, 00:15:58.987 "rw_mbytes_per_sec": 0, 00:15:58.987 "r_mbytes_per_sec": 0, 00:15:58.987 "w_mbytes_per_sec": 0 00:15:58.987 }, 00:15:58.987 "claimed": true, 00:15:58.987 "claim_type": "exclusive_write", 00:15:58.987 "zoned": false, 00:15:58.987 "supported_io_types": { 00:15:58.987 "read": true, 00:15:58.987 "write": true, 00:15:58.987 "unmap": true, 00:15:58.987 "write_zeroes": true, 00:15:58.987 "flush": true, 00:15:58.987 "reset": true, 00:15:58.987 "compare": false, 00:15:58.987 "compare_and_write": false, 00:15:58.987 "abort": true, 
00:15:58.987 "nvme_admin": false, 00:15:58.987 "nvme_io": false 00:15:58.987 }, 00:15:58.987 "memory_domains": [ 00:15:58.987 { 00:15:58.987 "dma_device_id": "system", 00:15:58.987 "dma_device_type": 1 00:15:58.987 }, 00:15:58.987 { 00:15:58.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.987 "dma_device_type": 2 00:15:58.987 } 00:15:58.987 ], 00:15:58.987 "driver_specific": {} 00:15:58.987 }' 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:58.987 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:59.247 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:59.507 "name": "BaseBdev2", 00:15:59.507 "aliases": [ 00:15:59.507 "4b775420-4ac4-4900-bec1-05d76b31be72" 00:15:59.507 ], 00:15:59.507 "product_name": "Malloc disk", 00:15:59.507 "block_size": 512, 00:15:59.507 "num_blocks": 65536, 00:15:59.507 "uuid": "4b775420-4ac4-4900-bec1-05d76b31be72", 00:15:59.507 "assigned_rate_limits": { 00:15:59.507 "rw_ios_per_sec": 0, 00:15:59.507 "rw_mbytes_per_sec": 0, 00:15:59.507 "r_mbytes_per_sec": 0, 00:15:59.507 "w_mbytes_per_sec": 0 00:15:59.507 }, 00:15:59.507 "claimed": true, 00:15:59.507 "claim_type": "exclusive_write", 00:15:59.507 "zoned": false, 00:15:59.507 "supported_io_types": { 00:15:59.507 "read": true, 00:15:59.507 "write": true, 00:15:59.507 "unmap": true, 00:15:59.507 "write_zeroes": true, 00:15:59.507 "flush": true, 00:15:59.507 "reset": true, 00:15:59.507 "compare": false, 00:15:59.507 "compare_and_write": false, 00:15:59.507 "abort": true, 00:15:59.507 "nvme_admin": false, 00:15:59.507 "nvme_io": false 00:15:59.507 }, 00:15:59.507 "memory_domains": [ 00:15:59.507 { 00:15:59.507 "dma_device_id": "system", 00:15:59.507 "dma_device_type": 1 00:15:59.507 }, 00:15:59.507 { 00:15:59.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.507 "dma_device_type": 2 00:15:59.507 } 00:15:59.507 ], 00:15:59.507 "driver_specific": {} 00:15:59.507 }' 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.507 13:44:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:59.507 13:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:59.768 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.028 "name": "BaseBdev3", 00:16:00.028 "aliases": [ 00:16:00.028 "178b2e56-bafd-4958-9a92-e33bf77a264d" 00:16:00.028 ], 00:16:00.028 "product_name": "Malloc disk", 00:16:00.028 "block_size": 512, 00:16:00.028 "num_blocks": 65536, 00:16:00.028 "uuid": "178b2e56-bafd-4958-9a92-e33bf77a264d", 00:16:00.028 "assigned_rate_limits": { 00:16:00.028 "rw_ios_per_sec": 0, 00:16:00.028 "rw_mbytes_per_sec": 0, 00:16:00.028 "r_mbytes_per_sec": 0, 00:16:00.028 "w_mbytes_per_sec": 0 00:16:00.028 }, 00:16:00.028 "claimed": true, 00:16:00.028 "claim_type": "exclusive_write", 00:16:00.028 "zoned": false, 00:16:00.028 "supported_io_types": 
{ 00:16:00.028 "read": true, 00:16:00.028 "write": true, 00:16:00.028 "unmap": true, 00:16:00.028 "write_zeroes": true, 00:16:00.028 "flush": true, 00:16:00.028 "reset": true, 00:16:00.028 "compare": false, 00:16:00.028 "compare_and_write": false, 00:16:00.028 "abort": true, 00:16:00.028 "nvme_admin": false, 00:16:00.028 "nvme_io": false 00:16:00.028 }, 00:16:00.028 "memory_domains": [ 00:16:00.028 { 00:16:00.028 "dma_device_id": "system", 00:16:00.028 "dma_device_type": 1 00:16:00.028 }, 00:16:00.028 { 00:16:00.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.028 "dma_device_type": 2 00:16:00.028 } 00:16:00.028 ], 00:16:00.028 "driver_specific": {} 00:16:00.028 }' 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.028 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:00.288 13:44:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:00.288 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.548 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.548 "name": "BaseBdev4", 00:16:00.548 "aliases": [ 00:16:00.548 "ae114e5f-5314-4da4-b065-91f78b9acc0e" 00:16:00.548 ], 00:16:00.548 "product_name": "Malloc disk", 00:16:00.548 "block_size": 512, 00:16:00.548 "num_blocks": 65536, 00:16:00.548 "uuid": "ae114e5f-5314-4da4-b065-91f78b9acc0e", 00:16:00.548 "assigned_rate_limits": { 00:16:00.548 "rw_ios_per_sec": 0, 00:16:00.548 "rw_mbytes_per_sec": 0, 00:16:00.548 "r_mbytes_per_sec": 0, 00:16:00.548 "w_mbytes_per_sec": 0 00:16:00.548 }, 00:16:00.548 "claimed": true, 00:16:00.548 "claim_type": "exclusive_write", 00:16:00.548 "zoned": false, 00:16:00.548 "supported_io_types": { 00:16:00.548 "read": true, 00:16:00.548 "write": true, 00:16:00.548 "unmap": true, 00:16:00.548 "write_zeroes": true, 00:16:00.548 "flush": true, 00:16:00.548 "reset": true, 00:16:00.548 "compare": false, 00:16:00.548 "compare_and_write": false, 00:16:00.548 "abort": true, 00:16:00.548 "nvme_admin": false, 00:16:00.548 "nvme_io": false 00:16:00.548 }, 00:16:00.548 "memory_domains": [ 00:16:00.548 { 00:16:00.548 "dma_device_id": "system", 00:16:00.548 "dma_device_type": 1 00:16:00.548 }, 00:16:00.548 { 00:16:00.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.548 "dma_device_type": 2 00:16:00.548 } 00:16:00.548 ], 00:16:00.548 "driver_specific": {} 00:16:00.548 }' 00:16:00.548 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.548 13:44:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.548 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.548 13:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:00.809 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:01.069 [2024-06-10 13:44:15.441854] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:01.069 [2024-06-10 13:44:15.441873] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:01.069 [2024-06-10 13:44:15.441915] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:01.069 [2024-06-10 13:44:15.441965] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:01.069 [2024-06-10 13:44:15.441971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ce8a0 name Existed_Raid, state offline 00:16:01.069 
13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1569570 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1569570 ']' 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1569570 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1569570 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1569570' 00:16:01.069 killing process with pid 1569570 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1569570 00:16:01.069 [2024-06-10 13:44:15.509067] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:01.069 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1569570 00:16:01.069 [2024-06-10 13:44:15.530575] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:01.330 13:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:01.330 00:16:01.330 real 0m28.203s 00:16:01.330 user 0m52.842s 00:16:01.330 sys 0m4.156s 00:16:01.330 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:01.330 13:44:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.330 
************************************ 00:16:01.330 END TEST raid_state_function_test_sb 00:16:01.330 ************************************ 00:16:01.330 13:44:15 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:16:01.330 13:44:15 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:16:01.330 13:44:15 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:01.330 13:44:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:01.330 ************************************ 00:16:01.330 START TEST raid_superblock_test 00:16:01.330 ************************************ 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 4 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1575845 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1575845 /var/tmp/spdk-raid.sock 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1575845 ']' 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:01.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:01.330 13:44:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.330 [2024-06-10 13:44:15.789259] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:16:01.330 [2024-06-10 13:44:15.789309] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1575845 ] 00:16:01.591 [2024-06-10 13:44:15.879965] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:01.591 [2024-06-10 13:44:15.946411] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:16:01.591 [2024-06-10 13:44:15.988404] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:01.591 [2024-06-10 13:44:15.988428] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:16:02.531 malloc1 00:16:02.531 13:44:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:02.792 [2024-06-10 13:44:17.031505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:02.792 [2024-06-10 13:44:17.031542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:02.792 [2024-06-10 13:44:17.031555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1042550 00:16:02.792 [2024-06-10 13:44:17.031562] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:02.792 [2024-06-10 13:44:17.032935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:02.792 [2024-06-10 13:44:17.032956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:02.792 pt1 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:02.792 malloc2 00:16:02.792 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:03.052 [2024-06-10 13:44:17.442740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:03.052 [2024-06-10 13:44:17.442773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.052 [2024-06-10 13:44:17.442784] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11040f0 00:16:03.052 [2024-06-10 13:44:17.442791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.052 [2024-06-10 13:44:17.444046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.052 [2024-06-10 13:44:17.444067] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:03.052 pt2 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:03.052 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:03.312 malloc3 00:16:03.312 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:03.572 [2024-06-10 13:44:17.845865] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:03.572 [2024-06-10 13:44:17.845900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.572 [2024-06-10 13:44:17.845912] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11055b0 00:16:03.572 [2024-06-10 13:44:17.845920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.572 [2024-06-10 13:44:17.847242] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.572 [2024-06-10 13:44:17.847263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:03.572 pt3 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:03.572 13:44:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:03.572 13:44:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:03.832 malloc4 00:16:03.832 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:03.832 [2024-06-10 13:44:18.240900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:03.832 [2024-06-10 13:44:18.240929] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.832 [2024-06-10 13:44:18.240941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1105d90 00:16:03.832 [2024-06-10 13:44:18.240948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.832 [2024-06-10 13:44:18.242203] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:03.832 [2024-06-10 13:44:18.242223] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:03.832 pt4 00:16:03.832 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:03.832 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:03.832 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:04.092 [2024-06-10 13:44:18.441423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:16:04.092 [2024-06-10 13:44:18.442491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:04.092 [2024-06-10 13:44:18.442536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:04.092 [2024-06-10 13:44:18.442572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:04.092 [2024-06-10 13:44:18.442715] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103bb60 00:16:04.092 [2024-06-10 13:44:18.442727] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:04.092 [2024-06-10 13:44:18.442885] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110e920 00:16:04.092 [2024-06-10 13:44:18.443000] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103bb60 00:16:04.092 [2024-06-10 13:44:18.443007] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103bb60 00:16:04.092 [2024-06-10 13:44:18.443084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs
00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:04.092 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:04.353 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:04.353 "name": "raid_bdev1",
00:16:04.353 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba",
00:16:04.353 "strip_size_kb": 64,
00:16:04.353 "state": "online",
00:16:04.353 "raid_level": "raid0",
00:16:04.353 "superblock": true,
00:16:04.353 "num_base_bdevs": 4,
00:16:04.353 "num_base_bdevs_discovered": 4,
00:16:04.353 "num_base_bdevs_operational": 4,
00:16:04.353 "base_bdevs_list": [
00:16:04.353 {
00:16:04.353 "name": "pt1",
00:16:04.353 "uuid": "00000000-0000-0000-0000-000000000001",
00:16:04.353 "is_configured": true,
00:16:04.353 "data_offset": 2048,
00:16:04.353 "data_size": 63488
00:16:04.353 },
00:16:04.353 {
00:16:04.353 "name": "pt2",
00:16:04.353 "uuid": "00000000-0000-0000-0000-000000000002",
00:16:04.353 "is_configured": true,
00:16:04.353 "data_offset": 2048,
00:16:04.353 "data_size": 63488
00:16:04.353 },
00:16:04.353 {
00:16:04.353 "name": "pt3",
00:16:04.353 "uuid": "00000000-0000-0000-0000-000000000003",
00:16:04.353 "is_configured": true,
00:16:04.353 "data_offset": 2048,
00:16:04.353 "data_size": 63488
00:16:04.353 },
00:16:04.353 {
00:16:04.353 "name": "pt4",
00:16:04.353 "uuid": "00000000-0000-0000-0000-000000000004",
00:16:04.353 "is_configured": true,
00:16:04.353 "data_offset": 2048,
00:16:04.353 "data_size": 63488
00:16:04.353 }
00:16:04.353 ]
00:16:04.353 }'
00:16:04.353 13:44:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:04.353 13:44:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:16:04.924 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:16:04.924 [2024-06-10 13:44:19.388043] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:16:05.185 "name": "raid_bdev1",
00:16:05.185 "aliases": [
00:16:05.185 "87a622ae-79c7-4610-8868-443c98bae0ba"
00:16:05.185 ],
00:16:05.185 "product_name": "Raid Volume",
00:16:05.185 "block_size": 512,
00:16:05.185 "num_blocks": 253952,
00:16:05.185 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba",
00:16:05.185 "assigned_rate_limits": {
00:16:05.185 "rw_ios_per_sec": 0,
00:16:05.185 "rw_mbytes_per_sec": 0,
00:16:05.185 "r_mbytes_per_sec": 0,
00:16:05.185 "w_mbytes_per_sec": 0
00:16:05.185 },
00:16:05.185 "claimed": false,
00:16:05.185 "zoned": false,
00:16:05.185 "supported_io_types": {
00:16:05.185 "read": true,
00:16:05.185 "write": true,
00:16:05.185 "unmap": true,
00:16:05.185 "write_zeroes": true,
00:16:05.185 "flush": true,
00:16:05.185 "reset": true,
00:16:05.185 "compare": false,
00:16:05.185 "compare_and_write": false,
00:16:05.185 "abort": false,
00:16:05.185 "nvme_admin": false,
00:16:05.185 "nvme_io": false
00:16:05.185 },
00:16:05.185 "memory_domains": [
00:16:05.185 {
00:16:05.185 "dma_device_id": "system",
00:16:05.185 "dma_device_type": 1
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.185 "dma_device_type": 2
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "system",
00:16:05.185 "dma_device_type": 1
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.185 "dma_device_type": 2
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "system",
00:16:05.185 "dma_device_type": 1
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.185 "dma_device_type": 2
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "system",
00:16:05.185 "dma_device_type": 1
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.185 "dma_device_type": 2
00:16:05.185 }
00:16:05.185 ],
00:16:05.185 "driver_specific": {
00:16:05.185 "raid": {
00:16:05.185 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba",
00:16:05.185 "strip_size_kb": 64,
00:16:05.185 "state": "online",
00:16:05.185 "raid_level": "raid0",
00:16:05.185 "superblock": true,
00:16:05.185 "num_base_bdevs": 4,
00:16:05.185 "num_base_bdevs_discovered": 4,
00:16:05.185 "num_base_bdevs_operational": 4,
00:16:05.185 "base_bdevs_list": [
00:16:05.185 {
00:16:05.185 "name": "pt1",
00:16:05.185 "uuid": "00000000-0000-0000-0000-000000000001",
00:16:05.185 "is_configured": true,
00:16:05.185 "data_offset": 2048,
00:16:05.185 "data_size": 63488
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "name": "pt2",
00:16:05.185 "uuid": "00000000-0000-0000-0000-000000000002",
00:16:05.185 "is_configured": true,
00:16:05.185 "data_offset": 2048,
00:16:05.185 "data_size": 63488
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "name": "pt3",
00:16:05.185 "uuid": "00000000-0000-0000-0000-000000000003",
00:16:05.185 "is_configured": true,
00:16:05.185 "data_offset": 2048,
00:16:05.185 "data_size": 63488
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "name": "pt4",
00:16:05.185 "uuid": "00000000-0000-0000-0000-000000000004",
00:16:05.185 "is_configured": true,
00:16:05.185 "data_offset": 2048,
00:16:05.185 "data_size": 63488
00:16:05.185 }
00:16:05.185 ]
00:16:05.185 }
00:16:05.185 }
00:16:05.185 }'
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:16:05.185 pt2
00:16:05.185 pt3
00:16:05.185 pt4'
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:16:05.185 "name": "pt1",
00:16:05.185 "aliases": [
00:16:05.185 "00000000-0000-0000-0000-000000000001"
00:16:05.185 ],
00:16:05.185 "product_name": "passthru",
00:16:05.185 "block_size": 512,
00:16:05.185 "num_blocks": 65536,
00:16:05.185 "uuid": "00000000-0000-0000-0000-000000000001",
00:16:05.185 "assigned_rate_limits": {
00:16:05.185 "rw_ios_per_sec": 0,
00:16:05.185 "rw_mbytes_per_sec": 0,
00:16:05.185 "r_mbytes_per_sec": 0,
00:16:05.185 "w_mbytes_per_sec": 0
00:16:05.185 },
00:16:05.185 "claimed": true,
00:16:05.185 "claim_type": "exclusive_write",
00:16:05.185 "zoned": false,
00:16:05.185 "supported_io_types": {
00:16:05.185 "read": true,
00:16:05.185 "write": true,
00:16:05.185 "unmap": true,
00:16:05.185 "write_zeroes": true,
00:16:05.185 "flush": true,
00:16:05.185 "reset": true,
00:16:05.185 "compare": false,
00:16:05.185 "compare_and_write": false,
00:16:05.185 "abort": true,
00:16:05.185 "nvme_admin": false,
00:16:05.185 "nvme_io": false
00:16:05.185 },
00:16:05.185 "memory_domains": [
00:16:05.185 {
00:16:05.185 "dma_device_id": "system",
00:16:05.185 "dma_device_type": 1
00:16:05.185 },
00:16:05.185 {
00:16:05.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.185 "dma_device_type": 2
00:16:05.185 }
00:16:05.185 ],
00:16:05.185 "driver_specific": {
00:16:05.185 "passthru": {
00:16:05.185 "name": "pt1",
00:16:05.185 "base_bdev_name": "malloc1"
00:16:05.185 }
00:16:05.185 }
00:16:05.185 }'
00:16:05.185 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:05.446 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:05.706 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:16:05.706 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:05.706 13:44:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:16:05.706 "name": "pt2",
00:16:05.706 "aliases": [
00:16:05.706 "00000000-0000-0000-0000-000000000002"
00:16:05.706 ],
00:16:05.706 "product_name": "passthru",
00:16:05.706 "block_size": 512,
00:16:05.706 "num_blocks": 65536,
00:16:05.706 "uuid": "00000000-0000-0000-0000-000000000002",
00:16:05.706 "assigned_rate_limits": {
00:16:05.706 "rw_ios_per_sec": 0,
00:16:05.706 "rw_mbytes_per_sec": 0,
00:16:05.706 "r_mbytes_per_sec": 0,
00:16:05.706 "w_mbytes_per_sec": 0
00:16:05.706 },
00:16:05.706 "claimed": true,
00:16:05.706 "claim_type": "exclusive_write",
00:16:05.706 "zoned": false,
00:16:05.706 "supported_io_types": {
00:16:05.706 "read": true,
00:16:05.706 "write": true,
00:16:05.706 "unmap": true,
00:16:05.706 "write_zeroes": true,
00:16:05.706 "flush": true,
00:16:05.706 "reset": true,
00:16:05.706 "compare": false,
00:16:05.706 "compare_and_write": false,
00:16:05.706 "abort": true,
00:16:05.706 "nvme_admin": false,
00:16:05.706 "nvme_io": false
00:16:05.706 },
00:16:05.706 "memory_domains": [
00:16:05.706 {
00:16:05.706 "dma_device_id": "system",
00:16:05.706 "dma_device_type": 1
00:16:05.706 },
00:16:05.706 {
00:16:05.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:05.706 "dma_device_type": 2
00:16:05.706 }
00:16:05.706 ],
00:16:05.706 "driver_specific": {
00:16:05.706 "passthru": {
00:16:05.706 "name": "pt2",
00:16:05.706 "base_bdev_name": "malloc2"
00:16:05.706 }
00:16:05.706 }
00:16:05.706 }'
00:16:05.706 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:16:05.966 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:06.227 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:06.227 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:16:06.227 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:16:06.227 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3
00:16:06.227 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:16:06.486 "name": "pt3",
00:16:06.486 "aliases": [
00:16:06.486 "00000000-0000-0000-0000-000000000003"
00:16:06.486 ],
00:16:06.486 "product_name": "passthru",
00:16:06.486 "block_size": 512,
00:16:06.486 "num_blocks": 65536,
00:16:06.486 "uuid": "00000000-0000-0000-0000-000000000003",
00:16:06.486 "assigned_rate_limits": {
00:16:06.486 "rw_ios_per_sec": 0,
00:16:06.486 "rw_mbytes_per_sec": 0,
00:16:06.486 "r_mbytes_per_sec": 0,
00:16:06.486 "w_mbytes_per_sec": 0
00:16:06.486 },
00:16:06.486 "claimed": true,
00:16:06.486 "claim_type": "exclusive_write",
00:16:06.486 "zoned": false,
00:16:06.486 "supported_io_types": {
00:16:06.486 "read": true,
00:16:06.486 "write": true,
00:16:06.486 "unmap": true,
00:16:06.486 "write_zeroes": true,
00:16:06.486 "flush": true,
00:16:06.486 "reset": true,
00:16:06.486 "compare": false,
00:16:06.486 "compare_and_write": false,
00:16:06.486 "abort": true,
00:16:06.486 "nvme_admin": false,
00:16:06.486 "nvme_io": false
00:16:06.486 },
00:16:06.486 "memory_domains": [
00:16:06.486 {
00:16:06.486 "dma_device_id": "system",
00:16:06.486 "dma_device_type": 1
00:16:06.486 },
00:16:06.486 {
00:16:06.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:06.486 "dma_device_type": 2
00:16:06.486 }
00:16:06.486 ],
00:16:06.486 "driver_specific": {
00:16:06.486 "passthru": {
00:16:06.486 "name": "pt3",
00:16:06.486 "base_bdev_name": "malloc3"
00:16:06.486 }
00:16:06.486 }
00:16:06.486 }'
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:06.486 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:06.746 13:44:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4
00:16:06.746 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:16:07.006 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:16:07.006 "name": "pt4",
00:16:07.006 "aliases": [
00:16:07.006 "00000000-0000-0000-0000-000000000004"
00:16:07.006 ],
00:16:07.006 "product_name": "passthru",
00:16:07.006 "block_size": 512,
00:16:07.006 "num_blocks": 65536,
00:16:07.006 "uuid": "00000000-0000-0000-0000-000000000004",
00:16:07.006 "assigned_rate_limits": {
00:16:07.006 "rw_ios_per_sec": 0,
00:16:07.006 "rw_mbytes_per_sec": 0,
00:16:07.006 "r_mbytes_per_sec": 0,
00:16:07.006 "w_mbytes_per_sec": 0
00:16:07.006 },
00:16:07.006 "claimed": true,
00:16:07.006 "claim_type": "exclusive_write",
00:16:07.006 "zoned": false,
00:16:07.006 "supported_io_types": {
00:16:07.006 "read": true,
00:16:07.006 "write": true,
00:16:07.006 "unmap": true,
00:16:07.006 "write_zeroes": true,
00:16:07.006 "flush": true,
00:16:07.006 "reset": true,
00:16:07.006 "compare": false,
00:16:07.006 "compare_and_write": false,
00:16:07.006 "abort": true,
00:16:07.006 "nvme_admin": false,
00:16:07.006 "nvme_io": false
00:16:07.006 },
00:16:07.006 "memory_domains": [
00:16:07.006 {
00:16:07.006 "dma_device_id": "system",
00:16:07.006 "dma_device_type": 1
00:16:07.006 },
00:16:07.006 {
00:16:07.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:07.006 "dma_device_type": 2
00:16:07.006 }
00:16:07.006 ],
00:16:07.006 "driver_specific": {
00:16:07.006 "passthru": {
00:16:07.006 "name": "pt4",
00:16:07.006 "base_bdev_name": "malloc4"
00:16:07.006 }
00:16:07.006 }
00:16:07.006 }'
00:16:07.006 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:16:07.007 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:16:07.267 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid'
00:16:07.528 [2024-06-10 13:44:21.814211] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:16:07.528 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=87a622ae-79c7-4610-8868-443c98bae0ba
00:16:07.528 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 87a622ae-79c7-4610-8868-443c98bae0ba ']'
00:16:07.528 13:44:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:16:07.789 [2024-06-10 13:44:22.018482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:16:07.789 [2024-06-10 13:44:22.018496] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:16:07.789 [2024-06-10 13:44:22.018536] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:16:07.789 [2024-06-10 13:44:22.018592] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:16:07.789 [2024-06-10 13:44:22.018598] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103bb60 name raid_bdev1, state offline
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]'
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev=
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']'
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:16:07.789 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:16:08.049 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:16:08.049 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:16:08.309 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:16:08.309 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3
00:16:08.569 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:16:08.569 13:44:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4
00:16:08.569 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs
00:16:08.569 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any'
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']'
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:16:08.829 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:16:09.089 [2024-06-10 13:44:23.434014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:16:09.089 [2024-06-10 13:44:23.435169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:16:09.089 [2024-06-10 13:44:23.435214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed
00:16:09.089 [2024-06-10 13:44:23.435243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed
00:16:09.089 [2024-06-10 13:44:23.435280] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:16:09.089 [2024-06-10 13:44:23.435309] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:16:09.089 [2024-06-10 13:44:23.435324] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3
00:16:09.089 [2024-06-10 13:44:23.435339] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4
00:16:09.089 [2024-06-10 13:44:23.435349] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:16:09.089 [2024-06-10 13:44:23.435355] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1042f10 name raid_bdev1, state configuring
00:16:09.089 request:
00:16:09.089 {
00:16:09.089 "name": "raid_bdev1",
00:16:09.089 "raid_level": "raid0",
00:16:09.089 "base_bdevs": [
00:16:09.089 "malloc1",
00:16:09.089 "malloc2",
00:16:09.089 "malloc3",
00:16:09.089 "malloc4"
00:16:09.089 ],
00:16:09.089 "superblock": false,
00:16:09.089 "strip_size_kb": 64,
00:16:09.089 "method": "bdev_raid_create",
00:16:09.089 "req_id": 1
00:16:09.089 }
00:16:09.089 Got JSON-RPC error response
00:16:09.089 response:
00:16:09.089 {
00:16:09.089 "code": -17,
00:16:09.089 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:16:09.089 }
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 ))
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]]
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 ))
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:09.089 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]'
00:16:09.349 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev=
00:16:09.349 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']'
00:16:09.349 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:16:09.610 [2024-06-10 13:44:23.842994] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:16:09.610 [2024-06-10 13:44:23.843028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:09.610 [2024-06-10 13:44:23.843041] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10457c0
00:16:09.610 [2024-06-10 13:44:23.843049] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:09.610 [2024-06-10 13:44:23.844424] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:09.610 [2024-06-10 13:44:23.844446] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:16:09.610 [2024-06-10 13:44:23.844501] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:16:09.610 [2024-06-10 13:44:23.844519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:16:09.610 pt1
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:09.610 13:44:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:09.610 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:09.610 "name": "raid_bdev1",
00:16:09.610 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba",
00:16:09.610 "strip_size_kb": 64,
00:16:09.610 "state": "configuring",
00:16:09.610 "raid_level": "raid0",
00:16:09.610 "superblock": true,
00:16:09.610 "num_base_bdevs": 4,
00:16:09.610 "num_base_bdevs_discovered": 1,
00:16:09.610 "num_base_bdevs_operational": 4,
00:16:09.610 "base_bdevs_list": [
00:16:09.610 {
00:16:09.610 "name": "pt1",
00:16:09.610 "uuid": "00000000-0000-0000-0000-000000000001",
00:16:09.610 "is_configured": true,
00:16:09.610 "data_offset": 2048,
00:16:09.610 "data_size": 63488
00:16:09.610 },
00:16:09.610 {
00:16:09.610 "name": null,
00:16:09.610 "uuid": "00000000-0000-0000-0000-000000000002",
00:16:09.610 "is_configured": false,
00:16:09.610 "data_offset": 2048,
00:16:09.610 "data_size": 63488
00:16:09.610 },
00:16:09.610 {
00:16:09.610 "name": null,
00:16:09.610 "uuid": "00000000-0000-0000-0000-000000000003",
00:16:09.610 "is_configured": false,
00:16:09.610 "data_offset": 2048,
00:16:09.610 "data_size": 63488
00:16:09.610 },
00:16:09.610 {
00:16:09.610 "name": null,
00:16:09.610 "uuid": "00000000-0000-0000-0000-000000000004",
00:16:09.610 "is_configured": false,
00:16:09.610 "data_offset": 2048,
00:16:09.610 "data_size": 63488
00:16:09.610 }
00:16:09.610 ]
00:16:09.610 }'
00:16:09.610 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:09.610 13:44:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:16:10.179 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']'
00:16:10.179 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:16:10.439 [2024-06-10 13:44:24.777375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:16:10.439 [2024-06-10 13:44:24.777409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:10.439 [2024-06-10 13:44:24.777427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103b5c0
00:16:10.439 [2024-06-10 13:44:24.777435] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:10.439 [2024-06-10 13:44:24.777715] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:10.439 [2024-06-10 13:44:24.777728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:16:10.439 [2024-06-10 13:44:24.777775] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:16:10.439 [2024-06-10 13:44:24.777788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:16:10.439 pt2
00:16:10.439 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:16:10.699 [2024-06-10 13:44:24.981897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:10.699 13:44:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:10.699 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:10.699 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:10.699 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:10.699 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:16:10.959 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:10.959 "name": "raid_bdev1",
00:16:10.959 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba",
00:16:10.959 "strip_size_kb": 64,
00:16:10.959 "state": "configuring",
00:16:10.959 "raid_level": "raid0",
00:16:10.959 "superblock": true,
00:16:10.959 "num_base_bdevs": 4,
00:16:10.959 "num_base_bdevs_discovered": 1,
00:16:10.959 "num_base_bdevs_operational": 4,
00:16:10.959 "base_bdevs_list": [
00:16:10.959 {
00:16:10.959 "name": "pt1",
00:16:10.959 "uuid": "00000000-0000-0000-0000-000000000001",
00:16:10.959 "is_configured": true,
00:16:10.959 "data_offset": 2048,
00:16:10.959 "data_size": 63488
00:16:10.959 },
00:16:10.959 {
00:16:10.959 "name": null,
00:16:10.959 "uuid": "00000000-0000-0000-0000-000000000002",
00:16:10.959 "is_configured": false,
00:16:10.959 "data_offset": 2048,
00:16:10.959 "data_size": 63488
00:16:10.959 },
00:16:10.959 {
00:16:10.959 "name": null,
00:16:10.959 "uuid": "00000000-0000-0000-0000-000000000003",
00:16:10.959 "is_configured": false,
00:16:10.959 "data_offset": 2048,
00:16:10.959 "data_size": 63488
00:16:10.959 },
00:16:10.959 {
00:16:10.959 "name": null,
00:16:10.959 "uuid": "00000000-0000-0000-0000-000000000004",
00:16:10.959 "is_configured": false,
00:16:10.959 "data_offset": 2048,
00:16:10.959 "data_size": 63488
00:16:10.959 }
00:16:10.959 ]
00:16:10.959 }'
00:16:10.959 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:10.959 13:44:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 ))
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:16:11.529 [2024-06-10 13:44:25.968401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:16:11.529 [2024-06-10 13:44:25.968436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:11.529 [2024-06-10 13:44:25.968455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1042a60
00:16:11.529 [2024-06-10 13:44:25.968462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:11.529 [2024-06-10 13:44:25.968742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:11.529 [2024-06-10 13:44:25.968755] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:16:11.529 [2024-06-10 13:44:25.968801] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:16:11.529 [2024-06-10 13:44:25.968814] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:16:11.529 pt2
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ ))
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:16:11.529 13:44:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003
00:16:11.789 [2024-06-10 13:44:26.168906] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3
00:16:11.789 [2024-06-10 13:44:26.168933] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:11.789 [2024-06-10 13:44:26.168943] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103ceb0
00:16:11.789 [2024-06-10 13:44:26.168950] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:11.789 [2024-06-10 13:44:26.169217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:11.789 [2024-06-10 13:44:26.169230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3
00:16:11.789 [2024-06-10 13:44:26.169271] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3
00:16:11.789 [2024-06-10 13:44:26.169282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed
00:16:11.789 pt3
00:16:11.789 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ ))
00:16:11.789 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:16:11.789 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004
00:16:12.049 [2024-06-10 13:44:26.369419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4
00:16:12.049 [2024-06-10 13:44:26.369444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:16:12.049 [2024-06-10 13:44:26.369455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1038ea0
00:16:12.049 [2024-06-10 13:44:26.369462] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:16:12.049 [2024-06-10 13:44:26.369704] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:16:12.049 [2024-06-10 13:44:26.369715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4
00:16:12.049 [2024-06-10 13:44:26.369751] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4
00:16:12.049 [2024-06-10 13:44:26.369762] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed
00:16:12.049 [2024-06-10 13:44:26.369860] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1039ac0
00:16:12.049 [2024-06-10 13:44:26.369867] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:16:12.049 [2024-06-10 13:44:26.370012] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110e920
00:16:12.049 [2024-06-10 13:44:26.370118]
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1039ac0 00:16:12.049 [2024-06-10 13:44:26.370123] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1039ac0 00:16:12.049 [2024-06-10 13:44:26.370208] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:12.049 pt4 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.049 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:16:12.310 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.310 "name": "raid_bdev1", 00:16:12.310 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba", 00:16:12.310 "strip_size_kb": 64, 00:16:12.310 "state": "online", 00:16:12.310 "raid_level": "raid0", 00:16:12.310 "superblock": true, 00:16:12.310 "num_base_bdevs": 4, 00:16:12.310 "num_base_bdevs_discovered": 4, 00:16:12.310 "num_base_bdevs_operational": 4, 00:16:12.310 "base_bdevs_list": [ 00:16:12.310 { 00:16:12.310 "name": "pt1", 00:16:12.310 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:12.310 "is_configured": true, 00:16:12.310 "data_offset": 2048, 00:16:12.310 "data_size": 63488 00:16:12.310 }, 00:16:12.310 { 00:16:12.310 "name": "pt2", 00:16:12.310 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:12.310 "is_configured": true, 00:16:12.310 "data_offset": 2048, 00:16:12.310 "data_size": 63488 00:16:12.310 }, 00:16:12.310 { 00:16:12.310 "name": "pt3", 00:16:12.310 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:12.310 "is_configured": true, 00:16:12.310 "data_offset": 2048, 00:16:12.310 "data_size": 63488 00:16:12.310 }, 00:16:12.310 { 00:16:12.310 "name": "pt4", 00:16:12.310 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:12.310 "is_configured": true, 00:16:12.310 "data_offset": 2048, 00:16:12.310 "data_size": 63488 00:16:12.310 } 00:16:12.310 ] 00:16:12.310 }' 00:16:12.310 13:44:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.310 13:44:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:12.881 [2024-06-10 13:44:27.312058] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:12.881 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:12.881 "name": "raid_bdev1", 00:16:12.881 "aliases": [ 00:16:12.881 "87a622ae-79c7-4610-8868-443c98bae0ba" 00:16:12.881 ], 00:16:12.881 "product_name": "Raid Volume", 00:16:12.881 "block_size": 512, 00:16:12.881 "num_blocks": 253952, 00:16:12.881 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba", 00:16:12.881 "assigned_rate_limits": { 00:16:12.881 "rw_ios_per_sec": 0, 00:16:12.881 "rw_mbytes_per_sec": 0, 00:16:12.881 "r_mbytes_per_sec": 0, 00:16:12.881 "w_mbytes_per_sec": 0 00:16:12.881 }, 00:16:12.881 "claimed": false, 00:16:12.881 "zoned": false, 00:16:12.881 "supported_io_types": { 00:16:12.881 "read": true, 00:16:12.881 "write": true, 00:16:12.881 "unmap": true, 00:16:12.881 "write_zeroes": true, 00:16:12.881 "flush": true, 00:16:12.881 "reset": true, 00:16:12.881 "compare": false, 00:16:12.881 "compare_and_write": false, 00:16:12.881 "abort": false, 00:16:12.881 "nvme_admin": false, 00:16:12.881 "nvme_io": false 00:16:12.881 }, 00:16:12.881 "memory_domains": [ 00:16:12.881 { 00:16:12.881 "dma_device_id": "system", 00:16:12.881 "dma_device_type": 1 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.881 "dma_device_type": 2 00:16:12.881 }, 00:16:12.881 { 
00:16:12.881 "dma_device_id": "system", 00:16:12.881 "dma_device_type": 1 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.881 "dma_device_type": 2 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "system", 00:16:12.881 "dma_device_type": 1 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.881 "dma_device_type": 2 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "system", 00:16:12.881 "dma_device_type": 1 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.881 "dma_device_type": 2 00:16:12.881 } 00:16:12.881 ], 00:16:12.881 "driver_specific": { 00:16:12.881 "raid": { 00:16:12.881 "uuid": "87a622ae-79c7-4610-8868-443c98bae0ba", 00:16:12.881 "strip_size_kb": 64, 00:16:12.881 "state": "online", 00:16:12.881 "raid_level": "raid0", 00:16:12.881 "superblock": true, 00:16:12.881 "num_base_bdevs": 4, 00:16:12.881 "num_base_bdevs_discovered": 4, 00:16:12.881 "num_base_bdevs_operational": 4, 00:16:12.881 "base_bdevs_list": [ 00:16:12.881 { 00:16:12.881 "name": "pt1", 00:16:12.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:12.881 "is_configured": true, 00:16:12.881 "data_offset": 2048, 00:16:12.881 "data_size": 63488 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "name": "pt2", 00:16:12.881 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:12.881 "is_configured": true, 00:16:12.881 "data_offset": 2048, 00:16:12.881 "data_size": 63488 00:16:12.881 }, 00:16:12.881 { 00:16:12.881 "name": "pt3", 00:16:12.881 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:12.881 "is_configured": true, 00:16:12.881 "data_offset": 2048, 00:16:12.882 "data_size": 63488 00:16:12.882 }, 00:16:12.882 { 00:16:12.882 "name": "pt4", 00:16:12.882 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:12.882 "is_configured": true, 00:16:12.882 "data_offset": 2048, 00:16:12.882 "data_size": 63488 00:16:12.882 } 00:16:12.882 ] 
00:16:12.882 } 00:16:12.882 } 00:16:12.882 }' 00:16:12.882 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:13.142 pt2 00:16:13.142 pt3 00:16:13.142 pt4' 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:13.142 "name": "pt1", 00:16:13.142 "aliases": [ 00:16:13.142 "00000000-0000-0000-0000-000000000001" 00:16:13.142 ], 00:16:13.142 "product_name": "passthru", 00:16:13.142 "block_size": 512, 00:16:13.142 "num_blocks": 65536, 00:16:13.142 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:13.142 "assigned_rate_limits": { 00:16:13.142 "rw_ios_per_sec": 0, 00:16:13.142 "rw_mbytes_per_sec": 0, 00:16:13.142 "r_mbytes_per_sec": 0, 00:16:13.142 "w_mbytes_per_sec": 0 00:16:13.142 }, 00:16:13.142 "claimed": true, 00:16:13.142 "claim_type": "exclusive_write", 00:16:13.142 "zoned": false, 00:16:13.142 "supported_io_types": { 00:16:13.142 "read": true, 00:16:13.142 "write": true, 00:16:13.142 "unmap": true, 00:16:13.142 "write_zeroes": true, 00:16:13.142 "flush": true, 00:16:13.142 "reset": true, 00:16:13.142 "compare": false, 00:16:13.142 "compare_and_write": false, 00:16:13.142 "abort": true, 00:16:13.142 "nvme_admin": false, 00:16:13.142 "nvme_io": false 00:16:13.142 }, 00:16:13.142 "memory_domains": [ 00:16:13.142 { 00:16:13.142 "dma_device_id": "system", 00:16:13.142 "dma_device_type": 1 00:16:13.142 }, 
00:16:13.142 { 00:16:13.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.142 "dma_device_type": 2 00:16:13.142 } 00:16:13.142 ], 00:16:13.142 "driver_specific": { 00:16:13.142 "passthru": { 00:16:13.142 "name": "pt1", 00:16:13.142 "base_bdev_name": "malloc1" 00:16:13.142 } 00:16:13.142 } 00:16:13.142 }' 00:16:13.142 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:13.401 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:13.661 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.661 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:13.661 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:13.661 13:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:13.661 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:16:13.661 "name": "pt2", 00:16:13.661 "aliases": [ 00:16:13.661 "00000000-0000-0000-0000-000000000002" 00:16:13.661 ], 00:16:13.661 "product_name": "passthru", 00:16:13.661 "block_size": 512, 00:16:13.661 "num_blocks": 65536, 00:16:13.661 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:13.661 "assigned_rate_limits": { 00:16:13.661 "rw_ios_per_sec": 0, 00:16:13.661 "rw_mbytes_per_sec": 0, 00:16:13.661 "r_mbytes_per_sec": 0, 00:16:13.661 "w_mbytes_per_sec": 0 00:16:13.661 }, 00:16:13.661 "claimed": true, 00:16:13.661 "claim_type": "exclusive_write", 00:16:13.661 "zoned": false, 00:16:13.661 "supported_io_types": { 00:16:13.661 "read": true, 00:16:13.661 "write": true, 00:16:13.661 "unmap": true, 00:16:13.661 "write_zeroes": true, 00:16:13.661 "flush": true, 00:16:13.661 "reset": true, 00:16:13.661 "compare": false, 00:16:13.661 "compare_and_write": false, 00:16:13.661 "abort": true, 00:16:13.661 "nvme_admin": false, 00:16:13.661 "nvme_io": false 00:16:13.661 }, 00:16:13.661 "memory_domains": [ 00:16:13.661 { 00:16:13.661 "dma_device_id": "system", 00:16:13.661 "dma_device_type": 1 00:16:13.661 }, 00:16:13.661 { 00:16:13.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.661 "dma_device_type": 2 00:16:13.661 } 00:16:13.661 ], 00:16:13.661 "driver_specific": { 00:16:13.661 "passthru": { 00:16:13.661 "name": "pt2", 00:16:13.661 "base_bdev_name": "malloc2" 00:16:13.661 } 00:16:13.661 } 00:16:13.661 }' 00:16:13.661 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.920 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.920 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:13.920 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.920 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.920 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:16:13.921 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.921 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.921 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.921 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:13.921 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:14.180 "name": "pt3", 00:16:14.180 "aliases": [ 00:16:14.180 "00000000-0000-0000-0000-000000000003" 00:16:14.180 ], 00:16:14.180 "product_name": "passthru", 00:16:14.180 "block_size": 512, 00:16:14.180 "num_blocks": 65536, 00:16:14.180 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:14.180 "assigned_rate_limits": { 00:16:14.180 "rw_ios_per_sec": 0, 00:16:14.180 "rw_mbytes_per_sec": 0, 00:16:14.180 "r_mbytes_per_sec": 0, 00:16:14.180 "w_mbytes_per_sec": 0 00:16:14.180 }, 00:16:14.180 "claimed": true, 00:16:14.180 "claim_type": "exclusive_write", 00:16:14.180 "zoned": false, 00:16:14.180 "supported_io_types": { 00:16:14.180 "read": true, 00:16:14.180 "write": true, 00:16:14.180 "unmap": true, 00:16:14.180 "write_zeroes": true, 00:16:14.180 "flush": true, 00:16:14.180 "reset": true, 00:16:14.180 "compare": false, 00:16:14.180 "compare_and_write": false, 
00:16:14.180 "abort": true, 00:16:14.180 "nvme_admin": false, 00:16:14.180 "nvme_io": false 00:16:14.180 }, 00:16:14.180 "memory_domains": [ 00:16:14.180 { 00:16:14.180 "dma_device_id": "system", 00:16:14.180 "dma_device_type": 1 00:16:14.180 }, 00:16:14.180 { 00:16:14.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.180 "dma_device_type": 2 00:16:14.180 } 00:16:14.180 ], 00:16:14.180 "driver_specific": { 00:16:14.180 "passthru": { 00:16:14.180 "name": "pt3", 00:16:14.180 "base_bdev_name": "malloc3" 00:16:14.180 } 00:16:14.180 } 00:16:14.180 }' 00:16:14.180 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:14.439 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.699 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.699 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:14.699 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.699 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:14.699 13:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:14.958 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:14.958 "name": "pt4", 00:16:14.958 "aliases": [ 00:16:14.958 "00000000-0000-0000-0000-000000000004" 00:16:14.958 ], 00:16:14.958 "product_name": "passthru", 00:16:14.958 "block_size": 512, 00:16:14.958 "num_blocks": 65536, 00:16:14.958 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:14.958 "assigned_rate_limits": { 00:16:14.958 "rw_ios_per_sec": 0, 00:16:14.959 "rw_mbytes_per_sec": 0, 00:16:14.959 "r_mbytes_per_sec": 0, 00:16:14.959 "w_mbytes_per_sec": 0 00:16:14.959 }, 00:16:14.959 "claimed": true, 00:16:14.959 "claim_type": "exclusive_write", 00:16:14.959 "zoned": false, 00:16:14.959 "supported_io_types": { 00:16:14.959 "read": true, 00:16:14.959 "write": true, 00:16:14.959 "unmap": true, 00:16:14.959 "write_zeroes": true, 00:16:14.959 "flush": true, 00:16:14.959 "reset": true, 00:16:14.959 "compare": false, 00:16:14.959 "compare_and_write": false, 00:16:14.959 "abort": true, 00:16:14.959 "nvme_admin": false, 00:16:14.959 "nvme_io": false 00:16:14.959 }, 00:16:14.959 "memory_domains": [ 00:16:14.959 { 00:16:14.959 "dma_device_id": "system", 00:16:14.959 "dma_device_type": 1 00:16:14.959 }, 00:16:14.959 { 00:16:14.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.959 "dma_device_type": 2 00:16:14.959 } 00:16:14.959 ], 00:16:14.959 "driver_specific": { 00:16:14.959 "passthru": { 00:16:14.959 "name": "pt4", 00:16:14.959 "base_bdev_name": "malloc4" 00:16:14.959 } 00:16:14.959 } 00:16:14.959 }' 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.959 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:15.219 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:15.479 [2024-06-10 13:44:29.726196] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 87a622ae-79c7-4610-8868-443c98bae0ba '!=' 87a622ae-79c7-4610-8868-443c98bae0ba ']' 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1575845 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 1575845 ']' 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1575845 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1575845 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1575845' 00:16:15.479 killing process with pid 1575845 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1575845 00:16:15.479 [2024-06-10 13:44:29.804762] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:15.479 [2024-06-10 13:44:29.804814] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:15.479 [2024-06-10 13:44:29.804864] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:15.479 [2024-06-10 13:44:29.804870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1039ac0 name raid_bdev1, state offline 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1575845 00:16:15.479 [2024-06-10 13:44:29.826585] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:15.479 00:16:15.479 real 0m14.216s 00:16:15.479 user 0m26.216s 00:16:15.479 sys 0m2.058s 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:16:15.479 13:44:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.479 ************************************ 00:16:15.479 END TEST raid_superblock_test 00:16:15.479 ************************************ 00:16:15.740 13:44:29 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:16:15.740 13:44:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:15.740 13:44:29 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:15.740 13:44:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:15.740 ************************************ 00:16:15.740 START TEST raid_read_error_test 00:16:15.740 ************************************ 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 read 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:15.740 13:44:30 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:15.740 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3CUd3f4cvH 00:16:15.741 
13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1578822 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1578822 /var/tmp/spdk-raid.sock 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1578822 ']' 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:15.741 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:15.741 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.741 [2024-06-10 13:44:30.101074] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:16:15.741 [2024-06-10 13:44:30.101134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1578822 ] 00:16:15.741 [2024-06-10 13:44:30.192719] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:16.002 [2024-06-10 13:44:30.258630] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:16:16.002 [2024-06-10 13:44:30.301485] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:16.002 [2024-06-10 13:44:30.301510] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:16.574 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:16.574 13:44:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:16:16.574 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:16.574 13:44:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:16.835 BaseBdev1_malloc 00:16:16.835 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:16.835 true 00:16:16.835 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:17.095 [2024-06-10 13:44:31.344552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:17.095 [2024-06-10 13:44:31.344585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.095 
[2024-06-10 13:44:31.344597] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde9c90 00:16:17.095 [2024-06-10 13:44:31.344604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.095 [2024-06-10 13:44:31.346027] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.095 [2024-06-10 13:44:31.346048] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:17.095 BaseBdev1 00:16:17.095 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:17.095 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:17.095 BaseBdev2_malloc 00:16:17.095 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:17.356 true 00:16:17.356 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:17.356 [2024-06-10 13:44:31.779649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:17.356 [2024-06-10 13:44:31.779677] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.356 [2024-06-10 13:44:31.779688] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdee400 00:16:17.356 [2024-06-10 13:44:31.779694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.356 [2024-06-10 13:44:31.780936] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.356 [2024-06-10 13:44:31.780956] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:17.356 BaseBdev2 00:16:17.356 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:17.356 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:17.617 BaseBdev3_malloc 00:16:17.617 13:44:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:17.877 true 00:16:17.877 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:17.877 [2024-06-10 13:44:32.335054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:17.877 [2024-06-10 13:44:32.335082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.877 [2024-06-10 13:44:32.335095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf0fc0 00:16:17.877 [2024-06-10 13:44:32.335102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.877 [2024-06-10 13:44:32.336342] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.877 [2024-06-10 13:44:32.336362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:17.877 BaseBdev3 00:16:18.137 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:18.137 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:16:18.137 BaseBdev4_malloc 00:16:18.137 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:18.397 true 00:16:18.397 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:18.658 [2024-06-10 13:44:32.894472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:18.658 [2024-06-10 13:44:32.894501] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.658 [2024-06-10 13:44:32.894513] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf1710 00:16:18.658 [2024-06-10 13:44:32.894520] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.658 [2024-06-10 13:44:32.895756] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.658 [2024-06-10 13:44:32.895775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:18.658 BaseBdev4 00:16:18.658 13:44:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:18.658 [2024-06-10 13:44:33.086994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:18.658 [2024-06-10 13:44:33.088047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:18.658 [2024-06-10 13:44:33.088102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:18.658 [2024-06-10 13:44:33.088152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:16:18.658 [2024-06-10 13:44:33.088345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdeb3b0 00:16:18.658 [2024-06-10 13:44:33.088354] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:18.658 [2024-06-10 13:44:33.088503] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xddfa90 00:16:18.658 [2024-06-10 13:44:33.088626] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdeb3b0 00:16:18.658 [2024-06-10 13:44:33.088632] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdeb3b0 00:16:18.658 [2024-06-10 13:44:33.088710] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.658 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:18.918 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.918 "name": "raid_bdev1", 00:16:18.918 "uuid": "3f1ad632-0978-41b2-9665-7258224c1d88", 00:16:18.918 "strip_size_kb": 64, 00:16:18.918 "state": "online", 00:16:18.918 "raid_level": "raid0", 00:16:18.918 "superblock": true, 00:16:18.918 "num_base_bdevs": 4, 00:16:18.918 "num_base_bdevs_discovered": 4, 00:16:18.918 "num_base_bdevs_operational": 4, 00:16:18.918 "base_bdevs_list": [ 00:16:18.918 { 00:16:18.918 "name": "BaseBdev1", 00:16:18.918 "uuid": "85c71495-774d-5f00-a944-afa1bdada47d", 00:16:18.918 "is_configured": true, 00:16:18.918 "data_offset": 2048, 00:16:18.918 "data_size": 63488 00:16:18.918 }, 00:16:18.918 { 00:16:18.918 "name": "BaseBdev2", 00:16:18.918 "uuid": "867b9d6f-c01d-5b34-8d4e-842e42f71cc5", 00:16:18.918 "is_configured": true, 00:16:18.918 "data_offset": 2048, 00:16:18.918 "data_size": 63488 00:16:18.919 }, 00:16:18.919 { 00:16:18.919 "name": "BaseBdev3", 00:16:18.919 "uuid": "900acdc5-7872-5eec-976a-a3856f02e07a", 00:16:18.919 "is_configured": true, 00:16:18.919 "data_offset": 2048, 00:16:18.919 "data_size": 63488 00:16:18.919 }, 00:16:18.919 { 00:16:18.919 "name": "BaseBdev4", 00:16:18.919 "uuid": "23ff28c6-2240-54fa-989f-3994e1398942", 00:16:18.919 "is_configured": true, 00:16:18.919 "data_offset": 2048, 00:16:18.919 "data_size": 63488 00:16:18.919 } 00:16:18.919 ] 00:16:18.919 }' 00:16:18.919 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.919 13:44:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.489 13:44:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:19.489 13:44:33 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:19.489 [2024-06-10 13:44:33.917293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc40200 00:16:20.430 13:44:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.691 13:44:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.691 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:20.952 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.952 "name": "raid_bdev1", 00:16:20.952 "uuid": "3f1ad632-0978-41b2-9665-7258224c1d88", 00:16:20.952 "strip_size_kb": 64, 00:16:20.952 "state": "online", 00:16:20.952 "raid_level": "raid0", 00:16:20.952 "superblock": true, 00:16:20.952 "num_base_bdevs": 4, 00:16:20.952 "num_base_bdevs_discovered": 4, 00:16:20.952 "num_base_bdevs_operational": 4, 00:16:20.952 "base_bdevs_list": [ 00:16:20.952 { 00:16:20.952 "name": "BaseBdev1", 00:16:20.952 "uuid": "85c71495-774d-5f00-a944-afa1bdada47d", 00:16:20.952 "is_configured": true, 00:16:20.952 "data_offset": 2048, 00:16:20.952 "data_size": 63488 00:16:20.952 }, 00:16:20.952 { 00:16:20.952 "name": "BaseBdev2", 00:16:20.952 "uuid": "867b9d6f-c01d-5b34-8d4e-842e42f71cc5", 00:16:20.952 "is_configured": true, 00:16:20.952 "data_offset": 2048, 00:16:20.952 "data_size": 63488 00:16:20.952 }, 00:16:20.952 { 00:16:20.952 "name": "BaseBdev3", 00:16:20.952 "uuid": "900acdc5-7872-5eec-976a-a3856f02e07a", 00:16:20.952 "is_configured": true, 00:16:20.953 "data_offset": 2048, 00:16:20.953 "data_size": 63488 00:16:20.953 }, 00:16:20.953 { 00:16:20.953 "name": "BaseBdev4", 00:16:20.953 "uuid": "23ff28c6-2240-54fa-989f-3994e1398942", 00:16:20.953 "is_configured": true, 00:16:20.953 "data_offset": 2048, 00:16:20.953 "data_size": 63488 00:16:20.953 } 00:16:20.953 ] 00:16:20.953 }' 00:16:20.953 13:44:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.953 13:44:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.523 13:44:35 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:21.786 [2024-06-10 13:44:36.001972] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:21.786 [2024-06-10 13:44:36.001998] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:21.786 [2024-06-10 13:44:36.004847] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:21.786 [2024-06-10 13:44:36.004882] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:21.786 [2024-06-10 13:44:36.004912] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:21.786 [2024-06-10 13:44:36.004918] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdeb3b0 name raid_bdev1, state offline 00:16:21.786 0 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1578822 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1578822 ']' 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1578822 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1578822 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1578822' 00:16:21.786 killing process with pid 1578822 
00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1578822 00:16:21.786 [2024-06-10 13:44:36.085433] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1578822 00:16:21.786 [2024-06-10 13:44:36.103058] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3CUd3f4cvH 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.48 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:16:21.786 00:16:21.786 real 0m6.213s 00:16:21.786 user 0m9.949s 00:16:21.786 sys 0m0.870s 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:21.786 13:44:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.786 ************************************ 00:16:21.786 END TEST raid_read_error_test 00:16:21.786 ************************************ 00:16:22.072 13:44:36 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:16:22.072 13:44:36 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:22.072 13:44:36 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:22.072 13:44:36 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:22.072 ************************************ 00:16:22.072 START TEST raid_write_error_test 00:16:22.072 ************************************ 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 write 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.072 13:44:36 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SuRjOJiGhW 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1580232 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1580232 /var/tmp/spdk-raid.sock 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1580232 ']' 00:16:22.072 
13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:22.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:22.072 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.072 [2024-06-10 13:44:36.370713] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:16:22.072 [2024-06-10 13:44:36.370761] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1580232 ] 00:16:22.072 [2024-06-10 13:44:36.457455] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:22.072 [2024-06-10 13:44:36.522034] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:16:22.354 [2024-06-10 13:44:36.568407] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:22.354 [2024-06-10 13:44:36.568432] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:22.354 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:22.354 13:44:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:16:22.354 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:22.354 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:22.614 BaseBdev1_malloc 00:16:22.614 13:44:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:22.614 true 00:16:22.614 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:22.874 [2024-06-10 13:44:37.186276] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:22.874 [2024-06-10 13:44:37.186307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.874 [2024-06-10 13:44:37.186320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x223bc90 00:16:22.874 [2024-06-10 13:44:37.186327] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.874 [2024-06-10 13:44:37.187748] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.874 [2024-06-10 13:44:37.187769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:22.874 BaseBdev1 00:16:22.874 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:22.874 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:23.134 BaseBdev2_malloc 00:16:23.134 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:23.134 true 00:16:23.134 13:44:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:23.394 [2024-06-10 13:44:37.793870] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:23.394 [2024-06-10 13:44:37.793899] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.394 [2024-06-10 13:44:37.793910] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2240400 00:16:23.394 [2024-06-10 13:44:37.793917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.394 [2024-06-10 13:44:37.795184] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.394 [2024-06-10 13:44:37.795204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:23.394 BaseBdev2 00:16:23.394 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:23.394 13:44:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:23.654 BaseBdev3_malloc 00:16:23.654 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:23.914 true 00:16:23.914 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:23.914 [2024-06-10 13:44:38.385392] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:23.914 [2024-06-10 13:44:38.385418] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.914 [2024-06-10 13:44:38.385431] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2242fc0 00:16:23.914 [2024-06-10 13:44:38.385437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.914 [2024-06-10 13:44:38.386692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.914 [2024-06-10 13:44:38.386711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:24.175 BaseBdev3 00:16:24.175 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:24.175 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:24.175 BaseBdev4_malloc 00:16:24.175 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:24.434 true 00:16:24.434 13:44:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:24.694 [2024-06-10 13:44:38.988936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:24.694 [2024-06-10 13:44:38.988965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:24.694 [2024-06-10 13:44:38.988980] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2243710 00:16:24.694 [2024-06-10 13:44:38.988987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.694 [2024-06-10 13:44:38.990240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:16:24.694 [2024-06-10 13:44:38.990259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:24.694 BaseBdev4 00:16:24.694 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:24.953 [2024-06-10 13:44:39.193481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:24.953 [2024-06-10 13:44:39.194559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:24.953 [2024-06-10 13:44:39.194615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:24.953 [2024-06-10 13:44:39.194665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:24.953 [2024-06-10 13:44:39.194852] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x223d3b0 00:16:24.953 [2024-06-10 13:44:39.194860] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:24.953 [2024-06-10 13:44:39.195013] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2231a90 00:16:24.953 [2024-06-10 13:44:39.195133] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223d3b0 00:16:24.953 [2024-06-10 13:44:39.195139] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x223d3b0 00:16:24.953 [2024-06-10 13:44:39.195226] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.953 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.953 "name": "raid_bdev1", 00:16:24.953 "uuid": "949a7c04-b397-49f9-82cd-d8087fb437e0", 00:16:24.953 "strip_size_kb": 64, 00:16:24.953 "state": "online", 00:16:24.953 "raid_level": "raid0", 00:16:24.953 "superblock": true, 00:16:24.953 "num_base_bdevs": 4, 00:16:24.954 "num_base_bdevs_discovered": 4, 00:16:24.954 "num_base_bdevs_operational": 4, 00:16:24.954 "base_bdevs_list": [ 00:16:24.954 { 00:16:24.954 "name": "BaseBdev1", 00:16:24.954 "uuid": "341c3a9c-8b65-5f6f-8b58-9838f0f8327a", 00:16:24.954 "is_configured": true, 00:16:24.954 "data_offset": 2048, 00:16:24.954 "data_size": 63488 00:16:24.954 }, 00:16:24.954 { 00:16:24.954 "name": "BaseBdev2", 00:16:24.954 "uuid": "87174458-abe2-5080-8776-81980c5491f4", 00:16:24.954 "is_configured": true, 
00:16:24.954 "data_offset": 2048, 00:16:24.954 "data_size": 63488 00:16:24.954 }, 00:16:24.954 { 00:16:24.954 "name": "BaseBdev3", 00:16:24.954 "uuid": "341cc155-a094-5e68-99c5-00c8c6538dbb", 00:16:24.954 "is_configured": true, 00:16:24.954 "data_offset": 2048, 00:16:24.954 "data_size": 63488 00:16:24.954 }, 00:16:24.954 { 00:16:24.954 "name": "BaseBdev4", 00:16:24.954 "uuid": "eedc3c33-89fa-58d6-90ef-5cd2807ad51f", 00:16:24.954 "is_configured": true, 00:16:24.954 "data_offset": 2048, 00:16:24.954 "data_size": 63488 00:16:24.954 } 00:16:24.954 ] 00:16:24.954 }' 00:16:24.954 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.954 13:44:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.523 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:25.523 13:44:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:25.784 [2024-06-10 13:44:40.007747] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2092200 00:16:26.727 13:44:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.727 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:26.988 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.988 "name": "raid_bdev1", 00:16:26.988 "uuid": "949a7c04-b397-49f9-82cd-d8087fb437e0", 00:16:26.988 "strip_size_kb": 64, 00:16:26.988 "state": "online", 00:16:26.988 "raid_level": "raid0", 00:16:26.988 "superblock": true, 00:16:26.988 "num_base_bdevs": 4, 00:16:26.988 "num_base_bdevs_discovered": 4, 00:16:26.988 "num_base_bdevs_operational": 4, 00:16:26.988 "base_bdevs_list": [ 00:16:26.988 { 00:16:26.988 "name": "BaseBdev1", 00:16:26.988 "uuid": "341c3a9c-8b65-5f6f-8b58-9838f0f8327a", 00:16:26.988 "is_configured": true, 00:16:26.988 "data_offset": 2048, 00:16:26.988 "data_size": 63488 00:16:26.988 }, 00:16:26.988 { 00:16:26.988 "name": "BaseBdev2", 00:16:26.988 
"uuid": "87174458-abe2-5080-8776-81980c5491f4", 00:16:26.988 "is_configured": true, 00:16:26.988 "data_offset": 2048, 00:16:26.988 "data_size": 63488 00:16:26.988 }, 00:16:26.988 { 00:16:26.988 "name": "BaseBdev3", 00:16:26.988 "uuid": "341cc155-a094-5e68-99c5-00c8c6538dbb", 00:16:26.988 "is_configured": true, 00:16:26.988 "data_offset": 2048, 00:16:26.988 "data_size": 63488 00:16:26.988 }, 00:16:26.988 { 00:16:26.988 "name": "BaseBdev4", 00:16:26.988 "uuid": "eedc3c33-89fa-58d6-90ef-5cd2807ad51f", 00:16:26.988 "is_configured": true, 00:16:26.988 "data_offset": 2048, 00:16:26.988 "data_size": 63488 00:16:26.988 } 00:16:26.988 ] 00:16:26.988 }' 00:16:26.988 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.988 13:44:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.558 13:44:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:27.819 [2024-06-10 13:44:42.058662] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:27.819 [2024-06-10 13:44:42.058688] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:27.819 [2024-06-10 13:44:42.061482] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.819 [2024-06-10 13:44:42.061515] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:27.819 [2024-06-10 13:44:42.061544] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:27.819 [2024-06-10 13:44:42.061550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223d3b0 name raid_bdev1, state offline 00:16:27.819 0 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1580232 00:16:27.819 13:44:42 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1580232 ']' 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1580232 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1580232 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1580232' 00:16:27.819 killing process with pid 1580232 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1580232 00:16:27.819 [2024-06-10 13:44:42.112982] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1580232 00:16:27.819 [2024-06-10 13:44:42.130570] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SuRjOJiGhW 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:27.819 
13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:16:27.819 00:16:27.819 real 0m5.949s 00:16:27.819 user 0m9.860s 00:16:27.819 sys 0m0.858s 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:27.819 13:44:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.819 ************************************ 00:16:27.819 END TEST raid_write_error_test 00:16:27.819 ************************************ 00:16:28.084 13:44:42 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:28.084 13:44:42 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:16:28.084 13:44:42 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:28.084 13:44:42 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:28.084 13:44:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:28.084 ************************************ 00:16:28.084 START TEST raid_state_function_test 00:16:28.084 ************************************ 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 false 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1581353 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1581353' 00:16:28.084 Process raid pid: 1581353 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1581353 /var/tmp/spdk-raid.sock 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1581353 ']' 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:28.084 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:28.084 13:44:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.084 [2024-06-10 13:44:42.409445] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:16:28.084 [2024-06-10 13:44:42.409493] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:28.084 [2024-06-10 13:44:42.480591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:28.084 [2024-06-10 13:44:42.545523] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:16:28.344 [2024-06-10 13:44:42.588524] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.344 [2024-06-10 13:44:42.588547] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.914 13:44:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:28.914 13:44:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:16:28.914 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:29.174 [2024-06-10 13:44:43.448422] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:29.174 [2024-06-10 13:44:43.448452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:29.174 [2024-06-10 13:44:43.448458] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.174 [2024-06-10 13:44:43.448465] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:16:29.174 [2024-06-10 13:44:43.448470] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.174 [2024-06-10 13:44:43.448475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.174 [2024-06-10 13:44:43.448480] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:29.174 [2024-06-10 13:44:43.448486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.174 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.433 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.433 "name": "Existed_Raid", 00:16:29.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.433 "strip_size_kb": 64, 00:16:29.433 "state": "configuring", 00:16:29.433 "raid_level": "concat", 00:16:29.433 "superblock": false, 00:16:29.433 "num_base_bdevs": 4, 00:16:29.433 "num_base_bdevs_discovered": 0, 00:16:29.433 "num_base_bdevs_operational": 4, 00:16:29.433 "base_bdevs_list": [ 00:16:29.433 { 00:16:29.433 "name": "BaseBdev1", 00:16:29.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.433 "is_configured": false, 00:16:29.433 "data_offset": 0, 00:16:29.433 "data_size": 0 00:16:29.433 }, 00:16:29.433 { 00:16:29.433 "name": "BaseBdev2", 00:16:29.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.433 "is_configured": false, 00:16:29.433 "data_offset": 0, 00:16:29.433 "data_size": 0 00:16:29.433 }, 00:16:29.433 { 00:16:29.433 "name": "BaseBdev3", 00:16:29.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.433 "is_configured": false, 00:16:29.433 "data_offset": 0, 00:16:29.433 "data_size": 0 00:16:29.433 }, 00:16:29.433 { 00:16:29.433 "name": "BaseBdev4", 00:16:29.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.433 "is_configured": false, 00:16:29.433 "data_offset": 0, 00:16:29.433 "data_size": 0 00:16:29.433 } 00:16:29.433 ] 00:16:29.433 }' 00:16:29.433 13:44:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.433 13:44:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.003 13:44:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:30.003 [2024-06-10 13:44:44.402718] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:16:30.003 [2024-06-10 13:44:44.402738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195d890 name Existed_Raid, state configuring 00:16:30.003 13:44:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:30.263 [2024-06-10 13:44:44.603245] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:30.263 [2024-06-10 13:44:44.603263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:30.263 [2024-06-10 13:44:44.603268] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:30.263 [2024-06-10 13:44:44.603274] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:30.263 [2024-06-10 13:44:44.603279] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:30.263 [2024-06-10 13:44:44.603285] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:30.263 [2024-06-10 13:44:44.603290] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:30.263 [2024-06-10 13:44:44.603296] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:30.264 13:44:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:30.524 [2024-06-10 13:44:44.810602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:30.524 BaseBdev1 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:30.524 13:44:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:30.524 13:44:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:30.784 [ 00:16:30.784 { 00:16:30.784 "name": "BaseBdev1", 00:16:30.784 "aliases": [ 00:16:30.784 "c5321453-e644-49aa-929b-2e3be5130025" 00:16:30.784 ], 00:16:30.784 "product_name": "Malloc disk", 00:16:30.784 "block_size": 512, 00:16:30.784 "num_blocks": 65536, 00:16:30.784 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:30.784 "assigned_rate_limits": { 00:16:30.784 "rw_ios_per_sec": 0, 00:16:30.784 "rw_mbytes_per_sec": 0, 00:16:30.784 "r_mbytes_per_sec": 0, 00:16:30.784 "w_mbytes_per_sec": 0 00:16:30.784 }, 00:16:30.784 "claimed": true, 00:16:30.784 "claim_type": "exclusive_write", 00:16:30.784 "zoned": false, 00:16:30.784 "supported_io_types": { 00:16:30.784 "read": true, 00:16:30.784 "write": true, 00:16:30.784 "unmap": true, 00:16:30.784 "write_zeroes": true, 00:16:30.784 "flush": true, 00:16:30.784 "reset": true, 00:16:30.784 "compare": false, 00:16:30.784 "compare_and_write": false, 00:16:30.784 "abort": true, 00:16:30.784 "nvme_admin": false, 00:16:30.784 "nvme_io": false 00:16:30.784 }, 00:16:30.784 
"memory_domains": [ 00:16:30.784 { 00:16:30.784 "dma_device_id": "system", 00:16:30.784 "dma_device_type": 1 00:16:30.784 }, 00:16:30.784 { 00:16:30.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.784 "dma_device_type": 2 00:16:30.784 } 00:16:30.784 ], 00:16:30.784 "driver_specific": {} 00:16:30.784 } 00:16:30.784 ] 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.784 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.044 13:44:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.044 "name": "Existed_Raid", 00:16:31.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.044 "strip_size_kb": 64, 00:16:31.044 "state": "configuring", 00:16:31.044 "raid_level": "concat", 00:16:31.044 "superblock": false, 00:16:31.044 "num_base_bdevs": 4, 00:16:31.044 "num_base_bdevs_discovered": 1, 00:16:31.044 "num_base_bdevs_operational": 4, 00:16:31.044 "base_bdevs_list": [ 00:16:31.044 { 00:16:31.044 "name": "BaseBdev1", 00:16:31.044 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:31.044 "is_configured": true, 00:16:31.044 "data_offset": 0, 00:16:31.044 "data_size": 65536 00:16:31.044 }, 00:16:31.044 { 00:16:31.044 "name": "BaseBdev2", 00:16:31.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.044 "is_configured": false, 00:16:31.044 "data_offset": 0, 00:16:31.044 "data_size": 0 00:16:31.044 }, 00:16:31.044 { 00:16:31.044 "name": "BaseBdev3", 00:16:31.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.044 "is_configured": false, 00:16:31.044 "data_offset": 0, 00:16:31.044 "data_size": 0 00:16:31.044 }, 00:16:31.044 { 00:16:31.044 "name": "BaseBdev4", 00:16:31.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.044 "is_configured": false, 00:16:31.044 "data_offset": 0, 00:16:31.044 "data_size": 0 00:16:31.044 } 00:16:31.044 ] 00:16:31.044 }' 00:16:31.044 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.044 13:44:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.614 13:44:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.614 [2024-06-10 13:44:46.077802] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.614 [2024-06-10 13:44:46.077827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x195d100 name Existed_Raid, state configuring 00:16:31.873 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:31.873 [2024-06-10 13:44:46.278344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.873 [2024-06-10 13:44:46.279623] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.873 [2024-06-10 13:44:46.279649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.873 [2024-06-10 13:44:46.279655] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.873 [2024-06-10 13:44:46.279661] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.874 [2024-06-10 13:44:46.279666] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:31.874 [2024-06-10 13:44:46.279672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.874 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.133 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.133 "name": "Existed_Raid", 00:16:32.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.133 "strip_size_kb": 64, 00:16:32.133 "state": "configuring", 00:16:32.133 "raid_level": "concat", 00:16:32.133 "superblock": false, 00:16:32.133 "num_base_bdevs": 4, 00:16:32.133 "num_base_bdevs_discovered": 1, 00:16:32.133 "num_base_bdevs_operational": 4, 00:16:32.133 "base_bdevs_list": [ 00:16:32.133 { 00:16:32.133 "name": "BaseBdev1", 00:16:32.133 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:32.133 "is_configured": true, 00:16:32.133 "data_offset": 0, 00:16:32.133 "data_size": 65536 00:16:32.133 }, 00:16:32.133 { 00:16:32.133 "name": "BaseBdev2", 00:16:32.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.133 "is_configured": false, 00:16:32.133 "data_offset": 0, 00:16:32.133 "data_size": 0 00:16:32.134 }, 00:16:32.134 { 00:16:32.134 "name": "BaseBdev3", 00:16:32.134 "uuid": "00000000-0000-0000-0000-000000000000", 
00:16:32.134 "is_configured": false, 00:16:32.134 "data_offset": 0, 00:16:32.134 "data_size": 0 00:16:32.134 }, 00:16:32.134 { 00:16:32.134 "name": "BaseBdev4", 00:16:32.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.134 "is_configured": false, 00:16:32.134 "data_offset": 0, 00:16:32.134 "data_size": 0 00:16:32.134 } 00:16:32.134 ] 00:16:32.134 }' 00:16:32.134 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.134 13:44:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.703 13:44:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:32.963 [2024-06-10 13:44:47.181745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.963 BaseBdev2 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.963 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:33.223 [ 00:16:33.223 { 00:16:33.223 "name": "BaseBdev2", 00:16:33.223 "aliases": [ 00:16:33.223 "b03d047b-3381-47bd-b3ac-95fb00931a3e" 00:16:33.223 ], 00:16:33.223 "product_name": "Malloc disk", 00:16:33.223 "block_size": 512, 00:16:33.223 "num_blocks": 65536, 00:16:33.223 "uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:33.223 "assigned_rate_limits": { 00:16:33.223 "rw_ios_per_sec": 0, 00:16:33.223 "rw_mbytes_per_sec": 0, 00:16:33.223 "r_mbytes_per_sec": 0, 00:16:33.223 "w_mbytes_per_sec": 0 00:16:33.223 }, 00:16:33.223 "claimed": true, 00:16:33.223 "claim_type": "exclusive_write", 00:16:33.223 "zoned": false, 00:16:33.223 "supported_io_types": { 00:16:33.223 "read": true, 00:16:33.223 "write": true, 00:16:33.223 "unmap": true, 00:16:33.223 "write_zeroes": true, 00:16:33.223 "flush": true, 00:16:33.223 "reset": true, 00:16:33.223 "compare": false, 00:16:33.223 "compare_and_write": false, 00:16:33.223 "abort": true, 00:16:33.223 "nvme_admin": false, 00:16:33.223 "nvme_io": false 00:16:33.223 }, 00:16:33.223 "memory_domains": [ 00:16:33.223 { 00:16:33.223 "dma_device_id": "system", 00:16:33.223 "dma_device_type": 1 00:16:33.223 }, 00:16:33.223 { 00:16:33.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.223 "dma_device_type": 2 00:16:33.223 } 00:16:33.223 ], 00:16:33.223 "driver_specific": {} 00:16:33.223 } 00:16:33.223 ] 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.223 13:44:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.223 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.483 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.483 "name": "Existed_Raid", 00:16:33.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.483 "strip_size_kb": 64, 00:16:33.483 "state": "configuring", 00:16:33.483 "raid_level": "concat", 00:16:33.483 "superblock": false, 00:16:33.483 "num_base_bdevs": 4, 00:16:33.483 "num_base_bdevs_discovered": 2, 00:16:33.483 "num_base_bdevs_operational": 4, 00:16:33.483 "base_bdevs_list": [ 00:16:33.483 { 00:16:33.483 "name": "BaseBdev1", 00:16:33.483 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:33.483 "is_configured": true, 00:16:33.483 "data_offset": 0, 00:16:33.483 "data_size": 65536 00:16:33.483 }, 00:16:33.483 { 00:16:33.483 "name": "BaseBdev2", 00:16:33.483 
"uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:33.483 "is_configured": true, 00:16:33.483 "data_offset": 0, 00:16:33.483 "data_size": 65536 00:16:33.483 }, 00:16:33.483 { 00:16:33.483 "name": "BaseBdev3", 00:16:33.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.483 "is_configured": false, 00:16:33.483 "data_offset": 0, 00:16:33.483 "data_size": 0 00:16:33.483 }, 00:16:33.483 { 00:16:33.483 "name": "BaseBdev4", 00:16:33.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.483 "is_configured": false, 00:16:33.483 "data_offset": 0, 00:16:33.483 "data_size": 0 00:16:33.483 } 00:16:33.483 ] 00:16:33.483 }' 00:16:33.483 13:44:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.483 13:44:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:34.052 [2024-06-10 13:44:48.482112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.052 BaseBdev3 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:34.052 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.312 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:34.571 [ 00:16:34.571 { 00:16:34.571 "name": "BaseBdev3", 00:16:34.571 "aliases": [ 00:16:34.571 "59cf591b-b613-478e-a6f5-b9ed7d8a7223" 00:16:34.571 ], 00:16:34.571 "product_name": "Malloc disk", 00:16:34.571 "block_size": 512, 00:16:34.571 "num_blocks": 65536, 00:16:34.571 "uuid": "59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:34.571 "assigned_rate_limits": { 00:16:34.571 "rw_ios_per_sec": 0, 00:16:34.571 "rw_mbytes_per_sec": 0, 00:16:34.571 "r_mbytes_per_sec": 0, 00:16:34.571 "w_mbytes_per_sec": 0 00:16:34.571 }, 00:16:34.571 "claimed": true, 00:16:34.571 "claim_type": "exclusive_write", 00:16:34.571 "zoned": false, 00:16:34.571 "supported_io_types": { 00:16:34.571 "read": true, 00:16:34.571 "write": true, 00:16:34.571 "unmap": true, 00:16:34.571 "write_zeroes": true, 00:16:34.571 "flush": true, 00:16:34.571 "reset": true, 00:16:34.571 "compare": false, 00:16:34.571 "compare_and_write": false, 00:16:34.571 "abort": true, 00:16:34.571 "nvme_admin": false, 00:16:34.571 "nvme_io": false 00:16:34.571 }, 00:16:34.571 "memory_domains": [ 00:16:34.571 { 00:16:34.571 "dma_device_id": "system", 00:16:34.571 "dma_device_type": 1 00:16:34.571 }, 00:16:34.571 { 00:16:34.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.571 "dma_device_type": 2 00:16:34.571 } 00:16:34.571 ], 00:16:34.571 "driver_specific": {} 00:16:34.571 } 00:16:34.571 ] 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.571 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.572 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.572 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.572 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.572 13:44:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.832 13:44:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.832 "name": "Existed_Raid", 00:16:34.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.832 "strip_size_kb": 64, 00:16:34.832 "state": "configuring", 00:16:34.832 "raid_level": "concat", 00:16:34.832 "superblock": false, 00:16:34.832 "num_base_bdevs": 4, 00:16:34.832 "num_base_bdevs_discovered": 3, 00:16:34.832 "num_base_bdevs_operational": 4, 00:16:34.832 
"base_bdevs_list": [ 00:16:34.832 { 00:16:34.832 "name": "BaseBdev1", 00:16:34.832 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:34.832 "is_configured": true, 00:16:34.832 "data_offset": 0, 00:16:34.832 "data_size": 65536 00:16:34.832 }, 00:16:34.832 { 00:16:34.832 "name": "BaseBdev2", 00:16:34.832 "uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:34.832 "is_configured": true, 00:16:34.832 "data_offset": 0, 00:16:34.832 "data_size": 65536 00:16:34.832 }, 00:16:34.832 { 00:16:34.832 "name": "BaseBdev3", 00:16:34.832 "uuid": "59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:34.832 "is_configured": true, 00:16:34.832 "data_offset": 0, 00:16:34.832 "data_size": 65536 00:16:34.832 }, 00:16:34.832 { 00:16:34.832 "name": "BaseBdev4", 00:16:34.832 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.832 "is_configured": false, 00:16:34.832 "data_offset": 0, 00:16:34.832 "data_size": 0 00:16:34.832 } 00:16:34.832 ] 00:16:34.832 }' 00:16:34.832 13:44:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.832 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:35.401 [2024-06-10 13:44:49.822664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:35.401 [2024-06-10 13:44:49.822687] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195e160 00:16:35.401 [2024-06-10 13:44:49.822691] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:35.401 [2024-06-10 13:44:49.822884] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1949f20 00:16:35.401 [2024-06-10 13:44:49.822982] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195e160 00:16:35.401 
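The sequence this log records — declare a 4-disk concat array while only BaseBdev1 exists, then create each missing base bdev and re-read the array state — can be sketched as a standalone script. The `rpc.py` path and socket are taken from the log; a running `spdk_tgt` is assumed, and the helper names (`expected_state`, `create_array`) are illustrative, not part of the test suite:

```shell
#!/usr/bin/env bash
# Sketch of the flow in this log: a concat array is declared before its base
# bdevs exist, then each missing bdev is created and the state is re-checked.
# Assumes a running spdk_tgt serving the RPC socket below.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# The array stays "configuring" until every base bdev is discovered.
expected_state() {  # usage: expected_state <num_discovered> <num_operational>
    if [ "$1" -lt "$2" ]; then echo configuring; else echo online; fi
}

create_array() {
    # Declare the array first; only BaseBdev1 is claimable at this point.
    $rpc bdev_raid_create -z 64 -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
    for i in 2 3 4; do
        # 32 MiB malloc disk with 512-byte blocks -> 65536 blocks, as in the log.
        $rpc bdev_malloc_create 32 512 -b "BaseBdev$i"
        $rpc bdev_raid_get_bdevs all |
            jq -r '.[] | select(.name == "Existed_Raid").state'
    done
}
```

Nothing runs against the target until `create_array` is invoked; the state logic in `expected_state` mirrors the `num_base_bdevs_discovered` vs. `num_base_bdevs_operational` fields the log polls after each step.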
[2024-06-10 13:44:49.822987] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x195e160 00:16:35.401 [2024-06-10 13:44:49.823115] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:35.401 BaseBdev4 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:35.401 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:35.402 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:35.402 13:44:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:35.661 13:44:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:35.921 [ 00:16:35.921 { 00:16:35.921 "name": "BaseBdev4", 00:16:35.921 "aliases": [ 00:16:35.921 "ecb4622e-99ce-4fef-ab66-0baab405da5a" 00:16:35.921 ], 00:16:35.921 "product_name": "Malloc disk", 00:16:35.921 "block_size": 512, 00:16:35.921 "num_blocks": 65536, 00:16:35.921 "uuid": "ecb4622e-99ce-4fef-ab66-0baab405da5a", 00:16:35.921 "assigned_rate_limits": { 00:16:35.921 "rw_ios_per_sec": 0, 00:16:35.921 "rw_mbytes_per_sec": 0, 00:16:35.921 "r_mbytes_per_sec": 0, 00:16:35.921 "w_mbytes_per_sec": 0 00:16:35.921 }, 00:16:35.921 "claimed": true, 00:16:35.921 "claim_type": "exclusive_write", 00:16:35.921 "zoned": 
false, 00:16:35.921 "supported_io_types": { 00:16:35.921 "read": true, 00:16:35.921 "write": true, 00:16:35.921 "unmap": true, 00:16:35.921 "write_zeroes": true, 00:16:35.921 "flush": true, 00:16:35.921 "reset": true, 00:16:35.921 "compare": false, 00:16:35.921 "compare_and_write": false, 00:16:35.921 "abort": true, 00:16:35.921 "nvme_admin": false, 00:16:35.921 "nvme_io": false 00:16:35.921 }, 00:16:35.921 "memory_domains": [ 00:16:35.921 { 00:16:35.921 "dma_device_id": "system", 00:16:35.921 "dma_device_type": 1 00:16:35.921 }, 00:16:35.921 { 00:16:35.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.921 "dma_device_type": 2 00:16:35.921 } 00:16:35.921 ], 00:16:35.921 "driver_specific": {} 00:16:35.921 } 00:16:35.921 ] 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.921 
13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.921 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.922 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.922 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.181 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.181 "name": "Existed_Raid", 00:16:36.181 "uuid": "4ec47798-e2dd-4c76-808b-a567265cc57b", 00:16:36.181 "strip_size_kb": 64, 00:16:36.181 "state": "online", 00:16:36.181 "raid_level": "concat", 00:16:36.181 "superblock": false, 00:16:36.181 "num_base_bdevs": 4, 00:16:36.181 "num_base_bdevs_discovered": 4, 00:16:36.181 "num_base_bdevs_operational": 4, 00:16:36.181 "base_bdevs_list": [ 00:16:36.181 { 00:16:36.181 "name": "BaseBdev1", 00:16:36.181 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:36.181 "is_configured": true, 00:16:36.181 "data_offset": 0, 00:16:36.181 "data_size": 65536 00:16:36.181 }, 00:16:36.181 { 00:16:36.181 "name": "BaseBdev2", 00:16:36.181 "uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:36.181 "is_configured": true, 00:16:36.181 "data_offset": 0, 00:16:36.181 "data_size": 65536 00:16:36.181 }, 00:16:36.181 { 00:16:36.181 "name": "BaseBdev3", 00:16:36.181 "uuid": "59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:36.181 "is_configured": true, 00:16:36.181 "data_offset": 0, 00:16:36.181 "data_size": 65536 00:16:36.181 }, 00:16:36.181 { 00:16:36.181 "name": "BaseBdev4", 00:16:36.182 "uuid": "ecb4622e-99ce-4fef-ab66-0baab405da5a", 00:16:36.182 "is_configured": true, 00:16:36.182 "data_offset": 0, 00:16:36.182 "data_size": 65536 00:16:36.182 } 00:16:36.182 ] 00:16:36.182 }' 00:16:36.182 13:44:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.182 13:44:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:36.442 13:44:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:36.703 [2024-06-10 13:44:51.074120] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:36.703 "name": "Existed_Raid", 00:16:36.703 "aliases": [ 00:16:36.703 "4ec47798-e2dd-4c76-808b-a567265cc57b" 00:16:36.703 ], 00:16:36.703 "product_name": "Raid Volume", 00:16:36.703 "block_size": 512, 00:16:36.703 "num_blocks": 262144, 00:16:36.703 "uuid": "4ec47798-e2dd-4c76-808b-a567265cc57b", 00:16:36.703 "assigned_rate_limits": { 00:16:36.703 "rw_ios_per_sec": 0, 00:16:36.703 "rw_mbytes_per_sec": 0, 00:16:36.703 "r_mbytes_per_sec": 0, 00:16:36.703 "w_mbytes_per_sec": 0 00:16:36.703 }, 00:16:36.703 "claimed": false, 00:16:36.703 "zoned": false, 00:16:36.703 "supported_io_types": { 00:16:36.703 
"read": true, 00:16:36.703 "write": true, 00:16:36.703 "unmap": true, 00:16:36.703 "write_zeroes": true, 00:16:36.703 "flush": true, 00:16:36.703 "reset": true, 00:16:36.703 "compare": false, 00:16:36.703 "compare_and_write": false, 00:16:36.703 "abort": false, 00:16:36.703 "nvme_admin": false, 00:16:36.703 "nvme_io": false 00:16:36.703 }, 00:16:36.703 "memory_domains": [ 00:16:36.703 { 00:16:36.703 "dma_device_id": "system", 00:16:36.703 "dma_device_type": 1 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.703 "dma_device_type": 2 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "system", 00:16:36.703 "dma_device_type": 1 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.703 "dma_device_type": 2 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "system", 00:16:36.703 "dma_device_type": 1 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.703 "dma_device_type": 2 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "system", 00:16:36.703 "dma_device_type": 1 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.703 "dma_device_type": 2 00:16:36.703 } 00:16:36.703 ], 00:16:36.703 "driver_specific": { 00:16:36.703 "raid": { 00:16:36.703 "uuid": "4ec47798-e2dd-4c76-808b-a567265cc57b", 00:16:36.703 "strip_size_kb": 64, 00:16:36.703 "state": "online", 00:16:36.703 "raid_level": "concat", 00:16:36.703 "superblock": false, 00:16:36.703 "num_base_bdevs": 4, 00:16:36.703 "num_base_bdevs_discovered": 4, 00:16:36.703 "num_base_bdevs_operational": 4, 00:16:36.703 "base_bdevs_list": [ 00:16:36.703 { 00:16:36.703 "name": "BaseBdev1", 00:16:36.703 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:36.703 "is_configured": true, 00:16:36.703 "data_offset": 0, 00:16:36.703 "data_size": 65536 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "name": "BaseBdev2", 00:16:36.703 "uuid": 
"b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:36.703 "is_configured": true, 00:16:36.703 "data_offset": 0, 00:16:36.703 "data_size": 65536 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "name": "BaseBdev3", 00:16:36.703 "uuid": "59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:36.703 "is_configured": true, 00:16:36.703 "data_offset": 0, 00:16:36.703 "data_size": 65536 00:16:36.703 }, 00:16:36.703 { 00:16:36.703 "name": "BaseBdev4", 00:16:36.703 "uuid": "ecb4622e-99ce-4fef-ab66-0baab405da5a", 00:16:36.703 "is_configured": true, 00:16:36.703 "data_offset": 0, 00:16:36.703 "data_size": 65536 00:16:36.703 } 00:16:36.703 ] 00:16:36.703 } 00:16:36.703 } 00:16:36.703 }' 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:36.703 BaseBdev2 00:16:36.703 BaseBdev3 00:16:36.703 BaseBdev4' 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.703 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:36.963 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.963 "name": "BaseBdev1", 00:16:36.963 "aliases": [ 00:16:36.963 "c5321453-e644-49aa-929b-2e3be5130025" 00:16:36.963 ], 00:16:36.963 "product_name": "Malloc disk", 00:16:36.963 "block_size": 512, 00:16:36.963 "num_blocks": 65536, 00:16:36.963 "uuid": "c5321453-e644-49aa-929b-2e3be5130025", 00:16:36.963 "assigned_rate_limits": { 00:16:36.963 "rw_ios_per_sec": 0, 00:16:36.963 "rw_mbytes_per_sec": 0, 00:16:36.963 "r_mbytes_per_sec": 0, 
00:16:36.963 "w_mbytes_per_sec": 0 00:16:36.963 }, 00:16:36.963 "claimed": true, 00:16:36.963 "claim_type": "exclusive_write", 00:16:36.963 "zoned": false, 00:16:36.963 "supported_io_types": { 00:16:36.963 "read": true, 00:16:36.963 "write": true, 00:16:36.963 "unmap": true, 00:16:36.963 "write_zeroes": true, 00:16:36.963 "flush": true, 00:16:36.963 "reset": true, 00:16:36.963 "compare": false, 00:16:36.963 "compare_and_write": false, 00:16:36.963 "abort": true, 00:16:36.963 "nvme_admin": false, 00:16:36.963 "nvme_io": false 00:16:36.963 }, 00:16:36.963 "memory_domains": [ 00:16:36.963 { 00:16:36.963 "dma_device_id": "system", 00:16:36.963 "dma_device_type": 1 00:16:36.963 }, 00:16:36.963 { 00:16:36.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.963 "dma_device_type": 2 00:16:36.963 } 00:16:36.963 ], 00:16:36.963 "driver_specific": {} 00:16:36.963 }' 00:16:36.963 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.963 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.963 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.963 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
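Each base bdev is then inspected field by field with `jq` (`.block_size`, `.md_size`, `.md_interleave`, `.dif_type`), as the trace above shows. A condensed form of those checks might look like the following; the helper name `check_base_bdev` and the abridged sample JSON are illustrative, with the expected values (512-byte blocks, no metadata/DIF) taken from the `bdev_get_bdevs` output in this log:

```shell
# Condensed version of the per-bdev property checks in this log: block_size
# must be 512 and the metadata/DIF fields must be unset (jq prints "null"
# for absent keys, matching the [[ null == null ]] comparisons above).
check_base_bdev() {  # usage: check_base_bdev '<bdev json>'
    local info=$1
    [ "$(jq -r .block_size <<<"$info")" = 512 ] &&
    [ "$(jq -r .md_size <<<"$info")" = null ] &&
    [ "$(jq -r .md_interleave <<<"$info")" = null ] &&
    [ "$(jq -r .dif_type <<<"$info")" = null ]
}

# Abridged from the BaseBdev1 record earlier in this log.
sample='{"name":"BaseBdev1","block_size":512,"num_blocks":65536}'
```

`check_base_bdev "$sample"` succeeds for the sample above, just as the four separate `jq`/`[[ ... ]]` pairs in the trace do for each of the four base bdevs.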
00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.223 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:37.482 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.482 "name": "BaseBdev2", 00:16:37.482 "aliases": [ 00:16:37.482 "b03d047b-3381-47bd-b3ac-95fb00931a3e" 00:16:37.482 ], 00:16:37.482 "product_name": "Malloc disk", 00:16:37.482 "block_size": 512, 00:16:37.482 "num_blocks": 65536, 00:16:37.483 "uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:37.483 "assigned_rate_limits": { 00:16:37.483 "rw_ios_per_sec": 0, 00:16:37.483 "rw_mbytes_per_sec": 0, 00:16:37.483 "r_mbytes_per_sec": 0, 00:16:37.483 "w_mbytes_per_sec": 0 00:16:37.483 }, 00:16:37.483 "claimed": true, 00:16:37.483 "claim_type": "exclusive_write", 00:16:37.483 "zoned": false, 00:16:37.483 "supported_io_types": { 00:16:37.483 "read": true, 00:16:37.483 "write": true, 00:16:37.483 "unmap": true, 00:16:37.483 "write_zeroes": true, 00:16:37.483 "flush": true, 00:16:37.483 "reset": true, 00:16:37.483 "compare": false, 00:16:37.483 "compare_and_write": false, 00:16:37.483 "abort": true, 00:16:37.483 "nvme_admin": false, 00:16:37.483 "nvme_io": false 00:16:37.483 }, 00:16:37.483 "memory_domains": [ 00:16:37.483 { 00:16:37.483 "dma_device_id": "system", 00:16:37.483 "dma_device_type": 1 00:16:37.483 }, 00:16:37.483 { 00:16:37.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.483 "dma_device_type": 2 00:16:37.483 } 00:16:37.483 ], 00:16:37.483 "driver_specific": {} 00:16:37.483 }' 00:16:37.483 13:44:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.483 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.483 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.483 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.743 13:44:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:37.743 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.002 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.002 "name": "BaseBdev3", 00:16:38.002 "aliases": [ 00:16:38.002 "59cf591b-b613-478e-a6f5-b9ed7d8a7223" 00:16:38.002 ], 00:16:38.002 "product_name": "Malloc disk", 00:16:38.002 "block_size": 512, 00:16:38.002 "num_blocks": 65536, 00:16:38.002 "uuid": 
"59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:38.002 "assigned_rate_limits": { 00:16:38.002 "rw_ios_per_sec": 0, 00:16:38.002 "rw_mbytes_per_sec": 0, 00:16:38.002 "r_mbytes_per_sec": 0, 00:16:38.002 "w_mbytes_per_sec": 0 00:16:38.002 }, 00:16:38.002 "claimed": true, 00:16:38.002 "claim_type": "exclusive_write", 00:16:38.002 "zoned": false, 00:16:38.002 "supported_io_types": { 00:16:38.002 "read": true, 00:16:38.002 "write": true, 00:16:38.002 "unmap": true, 00:16:38.002 "write_zeroes": true, 00:16:38.002 "flush": true, 00:16:38.002 "reset": true, 00:16:38.002 "compare": false, 00:16:38.002 "compare_and_write": false, 00:16:38.002 "abort": true, 00:16:38.002 "nvme_admin": false, 00:16:38.002 "nvme_io": false 00:16:38.002 }, 00:16:38.002 "memory_domains": [ 00:16:38.002 { 00:16:38.002 "dma_device_id": "system", 00:16:38.002 "dma_device_type": 1 00:16:38.002 }, 00:16:38.002 { 00:16:38.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.002 "dma_device_type": 2 00:16:38.002 } 00:16:38.002 ], 00:16:38.002 "driver_specific": {} 00:16:38.002 }' 00:16:38.002 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.002 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.002 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.002 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.263 
13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:38.263 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:38.522 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:38.523 "name": "BaseBdev4", 00:16:38.523 "aliases": [ 00:16:38.523 "ecb4622e-99ce-4fef-ab66-0baab405da5a" 00:16:38.523 ], 00:16:38.523 "product_name": "Malloc disk", 00:16:38.523 "block_size": 512, 00:16:38.523 "num_blocks": 65536, 00:16:38.523 "uuid": "ecb4622e-99ce-4fef-ab66-0baab405da5a", 00:16:38.523 "assigned_rate_limits": { 00:16:38.523 "rw_ios_per_sec": 0, 00:16:38.523 "rw_mbytes_per_sec": 0, 00:16:38.523 "r_mbytes_per_sec": 0, 00:16:38.523 "w_mbytes_per_sec": 0 00:16:38.523 }, 00:16:38.523 "claimed": true, 00:16:38.523 "claim_type": "exclusive_write", 00:16:38.523 "zoned": false, 00:16:38.523 "supported_io_types": { 00:16:38.523 "read": true, 00:16:38.523 "write": true, 00:16:38.523 "unmap": true, 00:16:38.523 "write_zeroes": true, 00:16:38.523 "flush": true, 00:16:38.523 "reset": true, 00:16:38.523 "compare": false, 00:16:38.523 "compare_and_write": false, 00:16:38.523 "abort": true, 00:16:38.523 "nvme_admin": false, 00:16:38.523 "nvme_io": false 00:16:38.523 }, 00:16:38.523 "memory_domains": [ 00:16:38.523 { 00:16:38.523 "dma_device_id": "system", 00:16:38.523 "dma_device_type": 1 00:16:38.523 }, 00:16:38.523 { 00:16:38.523 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:38.523 "dma_device_type": 2 00:16:38.523 } 00:16:38.523 ], 00:16:38.523 "driver_specific": {} 00:16:38.523 }' 00:16:38.523 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.523 13:44:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:38.782 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:39.042 [2024-06-10 13:44:53.488114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:39.042 [2024-06-10 13:44:53.488134] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:39.042 [2024-06-10 13:44:53.488180] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@275 -- # local expected_state 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.042 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.302 13:44:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.302 "name": "Existed_Raid", 00:16:39.302 "uuid": "4ec47798-e2dd-4c76-808b-a567265cc57b", 00:16:39.302 "strip_size_kb": 64, 00:16:39.302 "state": "offline", 00:16:39.302 "raid_level": "concat", 00:16:39.302 "superblock": false, 00:16:39.302 "num_base_bdevs": 4, 00:16:39.302 "num_base_bdevs_discovered": 3, 00:16:39.302 "num_base_bdevs_operational": 3, 00:16:39.302 "base_bdevs_list": [ 00:16:39.302 { 00:16:39.302 "name": null, 00:16:39.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.302 "is_configured": false, 00:16:39.302 "data_offset": 0, 00:16:39.302 "data_size": 65536 00:16:39.302 }, 00:16:39.302 { 00:16:39.302 "name": "BaseBdev2", 00:16:39.302 "uuid": "b03d047b-3381-47bd-b3ac-95fb00931a3e", 00:16:39.302 "is_configured": true, 00:16:39.302 "data_offset": 0, 00:16:39.302 "data_size": 65536 00:16:39.302 }, 00:16:39.302 { 00:16:39.302 "name": "BaseBdev3", 00:16:39.302 "uuid": "59cf591b-b613-478e-a6f5-b9ed7d8a7223", 00:16:39.302 "is_configured": true, 00:16:39.302 "data_offset": 0, 00:16:39.302 "data_size": 65536 00:16:39.302 }, 00:16:39.302 { 00:16:39.302 "name": "BaseBdev4", 00:16:39.302 "uuid": "ecb4622e-99ce-4fef-ab66-0baab405da5a", 00:16:39.302 "is_configured": true, 00:16:39.302 "data_offset": 0, 00:16:39.302 "data_size": 65536 00:16:39.302 } 00:16:39.303 ] 00:16:39.303 }' 00:16:39.303 13:44:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.303 13:44:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.872 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:39.872 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:39.872 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:39.872 13:44:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.132 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.132 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.132 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:40.392 [2024-06-10 13:44:54.614993] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.392 13:44:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:40.651 [2024-06-10 13:44:55.010106] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:40.651 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:40.651 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:40.651 13:44:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.651 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:40.911 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:40.911 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:40.911 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:41.170 [2024-06-10 13:44:55.401189] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:41.170 [2024-06-10 13:44:55.401230] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195e160 name Existed_Raid, state offline 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 
-- # (( i < num_base_bdevs )) 00:16:41.170 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:41.429 BaseBdev2 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:41.429 13:44:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.688 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:41.948 [ 00:16:41.948 { 00:16:41.948 "name": "BaseBdev2", 00:16:41.948 "aliases": [ 00:16:41.948 "fb477de8-dbfe-49e6-9ab3-868284767e0d" 00:16:41.948 ], 00:16:41.948 "product_name": "Malloc disk", 00:16:41.948 "block_size": 512, 00:16:41.948 "num_blocks": 65536, 00:16:41.948 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:41.948 "assigned_rate_limits": { 00:16:41.948 "rw_ios_per_sec": 0, 00:16:41.948 "rw_mbytes_per_sec": 0, 00:16:41.948 "r_mbytes_per_sec": 0, 00:16:41.948 "w_mbytes_per_sec": 0 00:16:41.948 }, 00:16:41.948 "claimed": false, 00:16:41.948 "zoned": false, 00:16:41.948 "supported_io_types": { 
00:16:41.948 "read": true, 00:16:41.948 "write": true, 00:16:41.948 "unmap": true, 00:16:41.948 "write_zeroes": true, 00:16:41.948 "flush": true, 00:16:41.948 "reset": true, 00:16:41.948 "compare": false, 00:16:41.948 "compare_and_write": false, 00:16:41.948 "abort": true, 00:16:41.948 "nvme_admin": false, 00:16:41.948 "nvme_io": false 00:16:41.948 }, 00:16:41.948 "memory_domains": [ 00:16:41.948 { 00:16:41.948 "dma_device_id": "system", 00:16:41.948 "dma_device_type": 1 00:16:41.948 }, 00:16:41.948 { 00:16:41.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.948 "dma_device_type": 2 00:16:41.948 } 00:16:41.948 ], 00:16:41.948 "driver_specific": {} 00:16:41.948 } 00:16:41.948 ] 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:41.948 BaseBdev3 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:41.948 13:44:56 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.208 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:42.468 [ 00:16:42.468 { 00:16:42.468 "name": "BaseBdev3", 00:16:42.468 "aliases": [ 00:16:42.468 "8cb52624-8517-46c1-863d-20cfdd31a4e3" 00:16:42.468 ], 00:16:42.468 "product_name": "Malloc disk", 00:16:42.468 "block_size": 512, 00:16:42.468 "num_blocks": 65536, 00:16:42.468 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:42.468 "assigned_rate_limits": { 00:16:42.468 "rw_ios_per_sec": 0, 00:16:42.468 "rw_mbytes_per_sec": 0, 00:16:42.468 "r_mbytes_per_sec": 0, 00:16:42.468 "w_mbytes_per_sec": 0 00:16:42.468 }, 00:16:42.468 "claimed": false, 00:16:42.468 "zoned": false, 00:16:42.468 "supported_io_types": { 00:16:42.468 "read": true, 00:16:42.468 "write": true, 00:16:42.468 "unmap": true, 00:16:42.468 "write_zeroes": true, 00:16:42.468 "flush": true, 00:16:42.468 "reset": true, 00:16:42.468 "compare": false, 00:16:42.468 "compare_and_write": false, 00:16:42.468 "abort": true, 00:16:42.468 "nvme_admin": false, 00:16:42.468 "nvme_io": false 00:16:42.468 }, 00:16:42.468 "memory_domains": [ 00:16:42.468 { 00:16:42.468 "dma_device_id": "system", 00:16:42.468 "dma_device_type": 1 00:16:42.468 }, 00:16:42.468 { 00:16:42.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.468 "dma_device_type": 2 00:16:42.468 } 00:16:42.468 ], 00:16:42.468 "driver_specific": {} 00:16:42.468 } 00:16:42.468 ] 00:16:42.468 13:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:42.468 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.468 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:16:42.468 13:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:42.727 BaseBdev4 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:42.727 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.987 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:42.987 [ 00:16:42.987 { 00:16:42.987 "name": "BaseBdev4", 00:16:42.987 "aliases": [ 00:16:42.987 "3c1e1165-f67a-4b69-b2a8-ace1020cda70" 00:16:42.987 ], 00:16:42.987 "product_name": "Malloc disk", 00:16:42.987 "block_size": 512, 00:16:42.987 "num_blocks": 65536, 00:16:42.987 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:42.987 "assigned_rate_limits": { 00:16:42.987 "rw_ios_per_sec": 0, 00:16:42.987 "rw_mbytes_per_sec": 0, 00:16:42.987 "r_mbytes_per_sec": 0, 00:16:42.987 "w_mbytes_per_sec": 0 00:16:42.987 }, 00:16:42.987 "claimed": false, 00:16:42.987 "zoned": false, 00:16:42.987 "supported_io_types": { 00:16:42.987 "read": true, 00:16:42.987 
"write": true, 00:16:42.987 "unmap": true, 00:16:42.987 "write_zeroes": true, 00:16:42.987 "flush": true, 00:16:42.987 "reset": true, 00:16:42.987 "compare": false, 00:16:42.987 "compare_and_write": false, 00:16:42.987 "abort": true, 00:16:42.987 "nvme_admin": false, 00:16:42.987 "nvme_io": false 00:16:42.987 }, 00:16:42.987 "memory_domains": [ 00:16:42.987 { 00:16:42.987 "dma_device_id": "system", 00:16:42.987 "dma_device_type": 1 00:16:42.987 }, 00:16:42.987 { 00:16:42.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.987 "dma_device_type": 2 00:16:42.987 } 00:16:42.987 ], 00:16:42.987 "driver_specific": {} 00:16:42.987 } 00:16:42.987 ] 00:16:42.987 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:42.987 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:42.987 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:42.987 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:43.246 [2024-06-10 13:44:57.613322] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:43.246 [2024-06-10 13:44:57.613351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:43.246 [2024-06-10 13:44:57.613366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:43.246 [2024-06-10 13:44:57.614465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:43.246 [2024-06-10 13:44:57.614500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring concat 64 4 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.246 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.506 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.506 "name": "Existed_Raid", 00:16:43.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.506 "strip_size_kb": 64, 00:16:43.506 "state": "configuring", 00:16:43.506 "raid_level": "concat", 00:16:43.506 "superblock": false, 00:16:43.506 "num_base_bdevs": 4, 00:16:43.506 "num_base_bdevs_discovered": 3, 00:16:43.506 "num_base_bdevs_operational": 4, 00:16:43.506 "base_bdevs_list": [ 00:16:43.506 { 00:16:43.506 "name": "BaseBdev1", 00:16:43.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.506 
"is_configured": false, 00:16:43.506 "data_offset": 0, 00:16:43.506 "data_size": 0 00:16:43.506 }, 00:16:43.506 { 00:16:43.506 "name": "BaseBdev2", 00:16:43.506 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:43.506 "is_configured": true, 00:16:43.506 "data_offset": 0, 00:16:43.506 "data_size": 65536 00:16:43.506 }, 00:16:43.506 { 00:16:43.506 "name": "BaseBdev3", 00:16:43.506 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:43.506 "is_configured": true, 00:16:43.506 "data_offset": 0, 00:16:43.506 "data_size": 65536 00:16:43.506 }, 00:16:43.506 { 00:16:43.506 "name": "BaseBdev4", 00:16:43.506 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:43.506 "is_configured": true, 00:16:43.506 "data_offset": 0, 00:16:43.506 "data_size": 65536 00:16:43.506 } 00:16:43.506 ] 00:16:43.506 }' 00:16:43.506 13:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.506 13:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.075 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:44.335 [2024-06-10 13:44:58.555701] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.335 "name": "Existed_Raid", 00:16:44.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.335 "strip_size_kb": 64, 00:16:44.335 "state": "configuring", 00:16:44.335 "raid_level": "concat", 00:16:44.335 "superblock": false, 00:16:44.335 "num_base_bdevs": 4, 00:16:44.335 "num_base_bdevs_discovered": 2, 00:16:44.335 "num_base_bdevs_operational": 4, 00:16:44.335 "base_bdevs_list": [ 00:16:44.335 { 00:16:44.335 "name": "BaseBdev1", 00:16:44.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.335 "is_configured": false, 00:16:44.335 "data_offset": 0, 00:16:44.335 "data_size": 0 00:16:44.335 }, 00:16:44.335 { 00:16:44.335 "name": null, 00:16:44.335 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:44.335 "is_configured": false, 00:16:44.335 "data_offset": 0, 00:16:44.335 "data_size": 65536 00:16:44.335 }, 00:16:44.335 { 00:16:44.335 "name": "BaseBdev3", 00:16:44.335 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:44.335 "is_configured": true, 00:16:44.335 "data_offset": 0, 00:16:44.335 "data_size": 65536 00:16:44.335 }, 
00:16:44.335 { 00:16:44.335 "name": "BaseBdev4", 00:16:44.335 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:44.335 "is_configured": true, 00:16:44.335 "data_offset": 0, 00:16:44.335 "data_size": 65536 00:16:44.335 } 00:16:44.335 ] 00:16:44.335 }' 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.335 13:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.904 13:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.904 13:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:45.164 13:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:45.164 13:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:45.424 [2024-06-10 13:44:59.723841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.424 BaseBdev1 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:45.424 13:44:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.684 13:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.684 [ 00:16:45.684 { 00:16:45.684 "name": "BaseBdev1", 00:16:45.684 "aliases": [ 00:16:45.684 "6c986a6a-4129-4cce-94e7-fb11fec0997d" 00:16:45.684 ], 00:16:45.684 "product_name": "Malloc disk", 00:16:45.684 "block_size": 512, 00:16:45.684 "num_blocks": 65536, 00:16:45.684 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:45.684 "assigned_rate_limits": { 00:16:45.684 "rw_ios_per_sec": 0, 00:16:45.684 "rw_mbytes_per_sec": 0, 00:16:45.684 "r_mbytes_per_sec": 0, 00:16:45.684 "w_mbytes_per_sec": 0 00:16:45.684 }, 00:16:45.684 "claimed": true, 00:16:45.684 "claim_type": "exclusive_write", 00:16:45.684 "zoned": false, 00:16:45.684 "supported_io_types": { 00:16:45.684 "read": true, 00:16:45.684 "write": true, 00:16:45.684 "unmap": true, 00:16:45.684 "write_zeroes": true, 00:16:45.684 "flush": true, 00:16:45.684 "reset": true, 00:16:45.684 "compare": false, 00:16:45.684 "compare_and_write": false, 00:16:45.684 "abort": true, 00:16:45.684 "nvme_admin": false, 00:16:45.684 "nvme_io": false 00:16:45.684 }, 00:16:45.684 "memory_domains": [ 00:16:45.684 { 00:16:45.684 "dma_device_id": "system", 00:16:45.684 "dma_device_type": 1 00:16:45.684 }, 00:16:45.684 { 00:16:45.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.684 "dma_device_type": 2 00:16:45.684 } 00:16:45.684 ], 00:16:45.684 "driver_specific": {} 00:16:45.684 } 00:16:45.684 ] 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:45.685 
13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.685 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.945 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.945 "name": "Existed_Raid", 00:16:45.945 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.945 "strip_size_kb": 64, 00:16:45.945 "state": "configuring", 00:16:45.945 "raid_level": "concat", 00:16:45.945 "superblock": false, 00:16:45.945 "num_base_bdevs": 4, 00:16:45.945 "num_base_bdevs_discovered": 3, 00:16:45.945 "num_base_bdevs_operational": 4, 00:16:45.945 "base_bdevs_list": [ 00:16:45.945 { 00:16:45.945 "name": "BaseBdev1", 00:16:45.945 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:45.945 "is_configured": true, 00:16:45.945 
"data_offset": 0, 00:16:45.945 "data_size": 65536 00:16:45.945 }, 00:16:45.945 { 00:16:45.945 "name": null, 00:16:45.945 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:45.945 "is_configured": false, 00:16:45.945 "data_offset": 0, 00:16:45.945 "data_size": 65536 00:16:45.945 }, 00:16:45.945 { 00:16:45.945 "name": "BaseBdev3", 00:16:45.945 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:45.945 "is_configured": true, 00:16:45.945 "data_offset": 0, 00:16:45.945 "data_size": 65536 00:16:45.945 }, 00:16:45.945 { 00:16:45.945 "name": "BaseBdev4", 00:16:45.945 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:45.945 "is_configured": true, 00:16:45.945 "data_offset": 0, 00:16:45.945 "data_size": 65536 00:16:45.945 } 00:16:45.945 ] 00:16:45.945 }' 00:16:45.945 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.945 13:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.514 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.515 13:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:46.774 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:46.774 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:46.774 [2024-06-10 13:45:01.243748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.035 "name": "Existed_Raid", 00:16:47.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.035 "strip_size_kb": 64, 00:16:47.035 "state": "configuring", 00:16:47.035 "raid_level": "concat", 00:16:47.035 "superblock": false, 00:16:47.035 "num_base_bdevs": 4, 00:16:47.035 "num_base_bdevs_discovered": 2, 00:16:47.035 "num_base_bdevs_operational": 4, 00:16:47.035 "base_bdevs_list": [ 00:16:47.035 { 00:16:47.035 "name": "BaseBdev1", 00:16:47.035 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:47.035 "is_configured": true, 00:16:47.035 "data_offset": 0, 00:16:47.035 "data_size": 65536 00:16:47.035 }, 00:16:47.035 { 00:16:47.035 "name": null, 
00:16:47.035 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:47.035 "is_configured": false, 00:16:47.035 "data_offset": 0, 00:16:47.035 "data_size": 65536 00:16:47.035 }, 00:16:47.035 { 00:16:47.035 "name": null, 00:16:47.035 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:47.035 "is_configured": false, 00:16:47.035 "data_offset": 0, 00:16:47.035 "data_size": 65536 00:16:47.035 }, 00:16:47.035 { 00:16:47.035 "name": "BaseBdev4", 00:16:47.035 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:47.035 "is_configured": true, 00:16:47.035 "data_offset": 0, 00:16:47.035 "data_size": 65536 00:16:47.035 } 00:16:47.035 ] 00:16:47.035 }' 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.035 13:45:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.605 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.605 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:47.864 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:47.864 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:48.124 [2024-06-10 13:45:02.438813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- 
# local expected_state=configuring 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.124 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.385 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.385 "name": "Existed_Raid", 00:16:48.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.385 "strip_size_kb": 64, 00:16:48.385 "state": "configuring", 00:16:48.385 "raid_level": "concat", 00:16:48.385 "superblock": false, 00:16:48.385 "num_base_bdevs": 4, 00:16:48.385 "num_base_bdevs_discovered": 3, 00:16:48.385 "num_base_bdevs_operational": 4, 00:16:48.385 "base_bdevs_list": [ 00:16:48.385 { 00:16:48.385 "name": "BaseBdev1", 00:16:48.385 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:48.385 "is_configured": true, 00:16:48.385 "data_offset": 0, 00:16:48.385 "data_size": 65536 00:16:48.385 }, 00:16:48.385 { 00:16:48.385 "name": null, 00:16:48.385 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:48.385 
"is_configured": false, 00:16:48.385 "data_offset": 0, 00:16:48.385 "data_size": 65536 00:16:48.385 }, 00:16:48.385 { 00:16:48.385 "name": "BaseBdev3", 00:16:48.385 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:48.385 "is_configured": true, 00:16:48.385 "data_offset": 0, 00:16:48.385 "data_size": 65536 00:16:48.385 }, 00:16:48.385 { 00:16:48.385 "name": "BaseBdev4", 00:16:48.385 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:48.385 "is_configured": true, 00:16:48.385 "data_offset": 0, 00:16:48.385 "data_size": 65536 00:16:48.385 } 00:16:48.385 ] 00:16:48.385 }' 00:16:48.385 13:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.385 13:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.956 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.956 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:48.956 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:48.956 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.218 [2024-06-10 13:45:03.561666] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.218 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.478 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.478 "name": "Existed_Raid", 00:16:49.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.478 "strip_size_kb": 64, 00:16:49.478 "state": "configuring", 00:16:49.478 "raid_level": "concat", 00:16:49.478 "superblock": false, 00:16:49.478 "num_base_bdevs": 4, 00:16:49.478 "num_base_bdevs_discovered": 2, 00:16:49.478 "num_base_bdevs_operational": 4, 00:16:49.478 "base_bdevs_list": [ 00:16:49.478 { 00:16:49.478 "name": null, 00:16:49.478 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:49.478 "is_configured": false, 00:16:49.478 "data_offset": 0, 00:16:49.478 "data_size": 65536 00:16:49.478 }, 00:16:49.478 { 00:16:49.478 "name": null, 00:16:49.478 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:49.478 "is_configured": false, 00:16:49.478 "data_offset": 0, 00:16:49.478 "data_size": 65536 00:16:49.478 }, 
00:16:49.478 { 00:16:49.478 "name": "BaseBdev3", 00:16:49.478 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:49.478 "is_configured": true, 00:16:49.478 "data_offset": 0, 00:16:49.478 "data_size": 65536 00:16:49.478 }, 00:16:49.478 { 00:16:49.478 "name": "BaseBdev4", 00:16:49.478 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:49.478 "is_configured": true, 00:16:49.478 "data_offset": 0, 00:16:49.478 "data_size": 65536 00:16:49.478 } 00:16:49.478 ] 00:16:49.478 }' 00:16:49.478 13:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.478 13:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.049 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.049 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:50.310 [2024-06-10 13:45:04.730771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:50.310 13:45:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.310 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.633 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.633 "name": "Existed_Raid", 00:16:50.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.633 "strip_size_kb": 64, 00:16:50.633 "state": "configuring", 00:16:50.633 "raid_level": "concat", 00:16:50.633 "superblock": false, 00:16:50.633 "num_base_bdevs": 4, 00:16:50.633 "num_base_bdevs_discovered": 3, 00:16:50.633 "num_base_bdevs_operational": 4, 00:16:50.633 "base_bdevs_list": [ 00:16:50.633 { 00:16:50.633 "name": null, 00:16:50.633 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:50.633 "is_configured": false, 00:16:50.633 "data_offset": 0, 00:16:50.633 "data_size": 65536 00:16:50.633 }, 00:16:50.633 { 00:16:50.633 "name": "BaseBdev2", 00:16:50.633 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:50.633 "is_configured": true, 00:16:50.633 "data_offset": 0, 00:16:50.633 "data_size": 65536 00:16:50.633 }, 00:16:50.633 { 00:16:50.633 "name": "BaseBdev3", 00:16:50.633 "uuid": 
"8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:50.633 "is_configured": true, 00:16:50.633 "data_offset": 0, 00:16:50.633 "data_size": 65536 00:16:50.633 }, 00:16:50.633 { 00:16:50.633 "name": "BaseBdev4", 00:16:50.633 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:50.633 "is_configured": true, 00:16:50.633 "data_offset": 0, 00:16:50.633 "data_size": 65536 00:16:50.633 } 00:16:50.633 ] 00:16:50.633 }' 00:16:50.633 13:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.633 13:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.230 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.230 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.490 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:51.490 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.490 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:51.490 13:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6c986a6a-4129-4cce-94e7-fb11fec0997d 00:16:51.751 [2024-06-10 13:45:06.107428] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:51.751 [2024-06-10 13:45:06.107454] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195c030 00:16:51.751 [2024-06-10 13:45:06.107459] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, 
blocklen 512 00:16:51.751 [2024-06-10 13:45:06.107616] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1961ad0 00:16:51.751 [2024-06-10 13:45:06.107710] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195c030 00:16:51.751 [2024-06-10 13:45:06.107716] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x195c030 00:16:51.751 [2024-06-10 13:45:06.107835] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:51.751 NewBaseBdev 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:51.751 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.011 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:52.271 [ 00:16:52.271 { 00:16:52.271 "name": "NewBaseBdev", 00:16:52.271 "aliases": [ 00:16:52.271 "6c986a6a-4129-4cce-94e7-fb11fec0997d" 00:16:52.271 ], 00:16:52.271 "product_name": "Malloc disk", 00:16:52.271 "block_size": 512, 00:16:52.271 "num_blocks": 65536, 00:16:52.271 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:52.271 
"assigned_rate_limits": { 00:16:52.271 "rw_ios_per_sec": 0, 00:16:52.271 "rw_mbytes_per_sec": 0, 00:16:52.271 "r_mbytes_per_sec": 0, 00:16:52.271 "w_mbytes_per_sec": 0 00:16:52.271 }, 00:16:52.271 "claimed": true, 00:16:52.271 "claim_type": "exclusive_write", 00:16:52.271 "zoned": false, 00:16:52.271 "supported_io_types": { 00:16:52.271 "read": true, 00:16:52.271 "write": true, 00:16:52.271 "unmap": true, 00:16:52.271 "write_zeroes": true, 00:16:52.271 "flush": true, 00:16:52.271 "reset": true, 00:16:52.271 "compare": false, 00:16:52.271 "compare_and_write": false, 00:16:52.271 "abort": true, 00:16:52.271 "nvme_admin": false, 00:16:52.271 "nvme_io": false 00:16:52.271 }, 00:16:52.271 "memory_domains": [ 00:16:52.271 { 00:16:52.271 "dma_device_id": "system", 00:16:52.271 "dma_device_type": 1 00:16:52.271 }, 00:16:52.271 { 00:16:52.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.271 "dma_device_type": 2 00:16:52.271 } 00:16:52.271 ], 00:16:52.271 "driver_specific": {} 00:16:52.271 } 00:16:52.271 ] 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.271 13:45:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.271 "name": "Existed_Raid", 00:16:52.271 "uuid": "c9895efb-0e86-41c5-9f64-354431c4e8c2", 00:16:52.271 "strip_size_kb": 64, 00:16:52.271 "state": "online", 00:16:52.271 "raid_level": "concat", 00:16:52.271 "superblock": false, 00:16:52.271 "num_base_bdevs": 4, 00:16:52.271 "num_base_bdevs_discovered": 4, 00:16:52.271 "num_base_bdevs_operational": 4, 00:16:52.271 "base_bdevs_list": [ 00:16:52.271 { 00:16:52.271 "name": "NewBaseBdev", 00:16:52.271 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:52.271 "is_configured": true, 00:16:52.271 "data_offset": 0, 00:16:52.271 "data_size": 65536 00:16:52.271 }, 00:16:52.271 { 00:16:52.271 "name": "BaseBdev2", 00:16:52.271 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:52.271 "is_configured": true, 00:16:52.271 "data_offset": 0, 00:16:52.271 "data_size": 65536 00:16:52.271 }, 00:16:52.271 { 00:16:52.271 "name": "BaseBdev3", 00:16:52.271 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:52.271 "is_configured": true, 00:16:52.271 "data_offset": 0, 00:16:52.271 "data_size": 65536 00:16:52.271 }, 00:16:52.271 { 00:16:52.271 "name": "BaseBdev4", 00:16:52.271 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:52.271 "is_configured": true, 00:16:52.271 "data_offset": 0, 
00:16:52.271 "data_size": 65536 00:16:52.271 } 00:16:52.271 ] 00:16:52.271 }' 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.271 13:45:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.840 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:52.840 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:52.840 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:52.840 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:52.840 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:52.841 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:52.841 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:52.841 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.100 [2024-06-10 13:45:07.483180] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.100 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.100 "name": "Existed_Raid", 00:16:53.100 "aliases": [ 00:16:53.100 "c9895efb-0e86-41c5-9f64-354431c4e8c2" 00:16:53.100 ], 00:16:53.100 "product_name": "Raid Volume", 00:16:53.100 "block_size": 512, 00:16:53.100 "num_blocks": 262144, 00:16:53.100 "uuid": "c9895efb-0e86-41c5-9f64-354431c4e8c2", 00:16:53.100 "assigned_rate_limits": { 00:16:53.100 "rw_ios_per_sec": 0, 00:16:53.100 "rw_mbytes_per_sec": 0, 00:16:53.100 "r_mbytes_per_sec": 0, 00:16:53.100 "w_mbytes_per_sec": 0 00:16:53.100 }, 00:16:53.100 
"claimed": false, 00:16:53.100 "zoned": false, 00:16:53.100 "supported_io_types": { 00:16:53.100 "read": true, 00:16:53.100 "write": true, 00:16:53.100 "unmap": true, 00:16:53.100 "write_zeroes": true, 00:16:53.100 "flush": true, 00:16:53.100 "reset": true, 00:16:53.100 "compare": false, 00:16:53.100 "compare_and_write": false, 00:16:53.100 "abort": false, 00:16:53.100 "nvme_admin": false, 00:16:53.100 "nvme_io": false 00:16:53.100 }, 00:16:53.100 "memory_domains": [ 00:16:53.100 { 00:16:53.100 "dma_device_id": "system", 00:16:53.100 "dma_device_type": 1 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.100 "dma_device_type": 2 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "system", 00:16:53.100 "dma_device_type": 1 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.100 "dma_device_type": 2 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "system", 00:16:53.100 "dma_device_type": 1 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.100 "dma_device_type": 2 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "system", 00:16:53.100 "dma_device_type": 1 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.100 "dma_device_type": 2 00:16:53.100 } 00:16:53.100 ], 00:16:53.100 "driver_specific": { 00:16:53.100 "raid": { 00:16:53.100 "uuid": "c9895efb-0e86-41c5-9f64-354431c4e8c2", 00:16:53.100 "strip_size_kb": 64, 00:16:53.100 "state": "online", 00:16:53.100 "raid_level": "concat", 00:16:53.100 "superblock": false, 00:16:53.100 "num_base_bdevs": 4, 00:16:53.100 "num_base_bdevs_discovered": 4, 00:16:53.100 "num_base_bdevs_operational": 4, 00:16:53.100 "base_bdevs_list": [ 00:16:53.100 { 00:16:53.100 "name": "NewBaseBdev", 00:16:53.100 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:53.100 "is_configured": true, 00:16:53.100 "data_offset": 0, 00:16:53.100 
"data_size": 65536 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "name": "BaseBdev2", 00:16:53.100 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:53.100 "is_configured": true, 00:16:53.100 "data_offset": 0, 00:16:53.100 "data_size": 65536 00:16:53.100 }, 00:16:53.100 { 00:16:53.100 "name": "BaseBdev3", 00:16:53.100 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:53.100 "is_configured": true, 00:16:53.101 "data_offset": 0, 00:16:53.101 "data_size": 65536 00:16:53.101 }, 00:16:53.101 { 00:16:53.101 "name": "BaseBdev4", 00:16:53.101 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:53.101 "is_configured": true, 00:16:53.101 "data_offset": 0, 00:16:53.101 "data_size": 65536 00:16:53.101 } 00:16:53.101 ] 00:16:53.101 } 00:16:53.101 } 00:16:53.101 }' 00:16:53.101 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:53.101 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:53.101 BaseBdev2 00:16:53.101 BaseBdev3 00:16:53.101 BaseBdev4' 00:16:53.101 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.101 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:53.101 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.360 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.360 "name": "NewBaseBdev", 00:16:53.360 "aliases": [ 00:16:53.360 "6c986a6a-4129-4cce-94e7-fb11fec0997d" 00:16:53.360 ], 00:16:53.360 "product_name": "Malloc disk", 00:16:53.360 "block_size": 512, 00:16:53.360 "num_blocks": 65536, 00:16:53.360 "uuid": "6c986a6a-4129-4cce-94e7-fb11fec0997d", 00:16:53.360 "assigned_rate_limits": { 
00:16:53.360 "rw_ios_per_sec": 0, 00:16:53.360 "rw_mbytes_per_sec": 0, 00:16:53.360 "r_mbytes_per_sec": 0, 00:16:53.360 "w_mbytes_per_sec": 0 00:16:53.360 }, 00:16:53.360 "claimed": true, 00:16:53.360 "claim_type": "exclusive_write", 00:16:53.360 "zoned": false, 00:16:53.360 "supported_io_types": { 00:16:53.360 "read": true, 00:16:53.360 "write": true, 00:16:53.360 "unmap": true, 00:16:53.360 "write_zeroes": true, 00:16:53.360 "flush": true, 00:16:53.360 "reset": true, 00:16:53.360 "compare": false, 00:16:53.360 "compare_and_write": false, 00:16:53.360 "abort": true, 00:16:53.360 "nvme_admin": false, 00:16:53.360 "nvme_io": false 00:16:53.360 }, 00:16:53.360 "memory_domains": [ 00:16:53.360 { 00:16:53.360 "dma_device_id": "system", 00:16:53.360 "dma_device_type": 1 00:16:53.360 }, 00:16:53.360 { 00:16:53.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.360 "dma_device_type": 2 00:16:53.360 } 00:16:53.360 ], 00:16:53.360 "driver_specific": {} 00:16:53.360 }' 00:16:53.360 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.360 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.360 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.360 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.620 13:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:53.620 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.620 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.620 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.620 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:53.620 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.880 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.880 "name": "BaseBdev2", 00:16:53.880 "aliases": [ 00:16:53.880 "fb477de8-dbfe-49e6-9ab3-868284767e0d" 00:16:53.880 ], 00:16:53.880 "product_name": "Malloc disk", 00:16:53.880 "block_size": 512, 00:16:53.880 "num_blocks": 65536, 00:16:53.880 "uuid": "fb477de8-dbfe-49e6-9ab3-868284767e0d", 00:16:53.880 "assigned_rate_limits": { 00:16:53.880 "rw_ios_per_sec": 0, 00:16:53.880 "rw_mbytes_per_sec": 0, 00:16:53.880 "r_mbytes_per_sec": 0, 00:16:53.880 "w_mbytes_per_sec": 0 00:16:53.880 }, 00:16:53.880 "claimed": true, 00:16:53.880 "claim_type": "exclusive_write", 00:16:53.881 "zoned": false, 00:16:53.881 "supported_io_types": { 00:16:53.881 "read": true, 00:16:53.881 "write": true, 00:16:53.881 "unmap": true, 00:16:53.881 "write_zeroes": true, 00:16:53.881 "flush": true, 00:16:53.881 "reset": true, 00:16:53.881 "compare": false, 00:16:53.881 "compare_and_write": false, 00:16:53.881 "abort": true, 00:16:53.881 "nvme_admin": false, 00:16:53.881 "nvme_io": false 00:16:53.881 }, 00:16:53.881 "memory_domains": [ 00:16:53.881 { 00:16:53.881 "dma_device_id": "system", 00:16:53.881 "dma_device_type": 1 00:16:53.881 }, 00:16:53.881 { 00:16:53.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.881 "dma_device_type": 2 00:16:53.881 } 00:16:53.881 
], 00:16:53.881 "driver_specific": {} 00:16:53.881 }' 00:16:53.881 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.881 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.881 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.881 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:54.141 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.401 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.401 "name": "BaseBdev3", 00:16:54.401 "aliases": [ 00:16:54.401 "8cb52624-8517-46c1-863d-20cfdd31a4e3" 00:16:54.401 ], 00:16:54.401 "product_name": "Malloc disk", 00:16:54.401 
"block_size": 512, 00:16:54.401 "num_blocks": 65536, 00:16:54.401 "uuid": "8cb52624-8517-46c1-863d-20cfdd31a4e3", 00:16:54.401 "assigned_rate_limits": { 00:16:54.401 "rw_ios_per_sec": 0, 00:16:54.401 "rw_mbytes_per_sec": 0, 00:16:54.401 "r_mbytes_per_sec": 0, 00:16:54.401 "w_mbytes_per_sec": 0 00:16:54.401 }, 00:16:54.401 "claimed": true, 00:16:54.401 "claim_type": "exclusive_write", 00:16:54.401 "zoned": false, 00:16:54.401 "supported_io_types": { 00:16:54.401 "read": true, 00:16:54.401 "write": true, 00:16:54.401 "unmap": true, 00:16:54.401 "write_zeroes": true, 00:16:54.401 "flush": true, 00:16:54.401 "reset": true, 00:16:54.401 "compare": false, 00:16:54.401 "compare_and_write": false, 00:16:54.401 "abort": true, 00:16:54.401 "nvme_admin": false, 00:16:54.401 "nvme_io": false 00:16:54.401 }, 00:16:54.401 "memory_domains": [ 00:16:54.401 { 00:16:54.401 "dma_device_id": "system", 00:16:54.401 "dma_device_type": 1 00:16:54.401 }, 00:16:54.401 { 00:16:54.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.401 "dma_device_type": 2 00:16:54.401 } 00:16:54.401 ], 00:16:54.401 "driver_specific": {} 00:16:54.401 }' 00:16:54.401 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.401 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.661 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.661 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.661 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.661 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.661 13:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.661 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.661 13:45:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.661 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.661 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.922 "name": "BaseBdev4", 00:16:54.922 "aliases": [ 00:16:54.922 "3c1e1165-f67a-4b69-b2a8-ace1020cda70" 00:16:54.922 ], 00:16:54.922 "product_name": "Malloc disk", 00:16:54.922 "block_size": 512, 00:16:54.922 "num_blocks": 65536, 00:16:54.922 "uuid": "3c1e1165-f67a-4b69-b2a8-ace1020cda70", 00:16:54.922 "assigned_rate_limits": { 00:16:54.922 "rw_ios_per_sec": 0, 00:16:54.922 "rw_mbytes_per_sec": 0, 00:16:54.922 "r_mbytes_per_sec": 0, 00:16:54.922 "w_mbytes_per_sec": 0 00:16:54.922 }, 00:16:54.922 "claimed": true, 00:16:54.922 "claim_type": "exclusive_write", 00:16:54.922 "zoned": false, 00:16:54.922 "supported_io_types": { 00:16:54.922 "read": true, 00:16:54.922 "write": true, 00:16:54.922 "unmap": true, 00:16:54.922 "write_zeroes": true, 00:16:54.922 "flush": true, 00:16:54.922 "reset": true, 00:16:54.922 "compare": false, 00:16:54.922 "compare_and_write": false, 00:16:54.922 "abort": true, 00:16:54.922 "nvme_admin": false, 00:16:54.922 "nvme_io": false 00:16:54.922 }, 00:16:54.922 "memory_domains": [ 00:16:54.922 { 00:16:54.922 "dma_device_id": "system", 
00:16:54.922 "dma_device_type": 1 00:16:54.922 }, 00:16:54.922 { 00:16:54.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.922 "dma_device_type": 2 00:16:54.922 } 00:16:54.922 ], 00:16:54.922 "driver_specific": {} 00:16:54.922 }' 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.922 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.182 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.443 [2024-06-10 13:45:09.889065] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.443 [2024-06-10 13:45:09.889085] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.443 [2024-06-10 13:45:09.889127] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.443 [2024-06-10 13:45:09.889181] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.443 [2024-06-10 13:45:09.889188] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195c030 name Existed_Raid, state offline 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1581353 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1581353 ']' 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1581353 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:55.443 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1581353 00:16:55.704 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:55.704 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:55.704 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1581353' 00:16:55.704 killing process with pid 1581353 00:16:55.704 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1581353 00:16:55.704 [2024-06-10 13:45:09.966702] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:55.704 13:45:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1581353 00:16:55.704 [2024-06-10 13:45:09.988069] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.704 13:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.704 
00:16:55.704 real 0m27.770s 00:16:55.704 user 0m52.017s 00:16:55.704 sys 0m4.090s 00:16:55.704 13:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:55.704 13:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.704 ************************************ 00:16:55.704 END TEST raid_state_function_test 00:16:55.704 ************************************ 00:16:55.704 13:45:10 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:16:55.704 13:45:10 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:55.704 13:45:10 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:55.704 13:45:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:55.964 ************************************ 00:16:55.964 START TEST raid_state_function_test_sb 00:16:55.964 ************************************ 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 true 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.964 
13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.964 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # 
strip_size=64 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1587913 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1587913' 00:16:55.965 Process raid pid: 1587913 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1587913 /var/tmp/spdk-raid.sock 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1587913 ']' 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:55.965 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:55.965 13:45:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.965 [2024-06-10 13:45:10.257226] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:16:55.965 [2024-06-10 13:45:10.257277] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:55.965 [2024-06-10 13:45:10.348179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:55.965 [2024-06-10 13:45:10.415475] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:16:56.225 [2024-06-10 13:45:10.465823] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.225 [2024-06-10 13:45:10.465847] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.796 13:45:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:56.796 13:45:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:16:56.796 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:57.056 [2024-06-10 13:45:11.285736] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:57.056 [2024-06-10 13:45:11.285764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:57.056 [2024-06-10 13:45:11.285771] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:57.056 [2024-06-10 13:45:11.285777] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:57.056 [2024-06-10 13:45:11.285782] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:57.056 [2024-06-10 13:45:11.285788] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:57.056 
[2024-06-10 13:45:11.285793] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:57.056 [2024-06-10 13:45:11.285799] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.056 "name": "Existed_Raid", 00:16:57.056 "uuid": "032ffd21-061f-46b9-99f3-70c950cef6c4", 00:16:57.056 
"strip_size_kb": 64, 00:16:57.056 "state": "configuring", 00:16:57.056 "raid_level": "concat", 00:16:57.056 "superblock": true, 00:16:57.056 "num_base_bdevs": 4, 00:16:57.056 "num_base_bdevs_discovered": 0, 00:16:57.056 "num_base_bdevs_operational": 4, 00:16:57.056 "base_bdevs_list": [ 00:16:57.056 { 00:16:57.056 "name": "BaseBdev1", 00:16:57.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.056 "is_configured": false, 00:16:57.056 "data_offset": 0, 00:16:57.056 "data_size": 0 00:16:57.056 }, 00:16:57.056 { 00:16:57.056 "name": "BaseBdev2", 00:16:57.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.056 "is_configured": false, 00:16:57.056 "data_offset": 0, 00:16:57.056 "data_size": 0 00:16:57.056 }, 00:16:57.056 { 00:16:57.056 "name": "BaseBdev3", 00:16:57.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.056 "is_configured": false, 00:16:57.056 "data_offset": 0, 00:16:57.056 "data_size": 0 00:16:57.056 }, 00:16:57.056 { 00:16:57.056 "name": "BaseBdev4", 00:16:57.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.056 "is_configured": false, 00:16:57.056 "data_offset": 0, 00:16:57.056 "data_size": 0 00:16:57.056 } 00:16:57.056 ] 00:16:57.056 }' 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.056 13:45:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.626 13:45:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:57.885 [2024-06-10 13:45:12.236006] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:57.885 [2024-06-10 13:45:12.236025] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b9890 name Existed_Raid, state configuring 00:16:57.886 13:45:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:58.146 [2024-06-10 13:45:12.452585] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:58.146 [2024-06-10 13:45:12.452603] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:58.146 [2024-06-10 13:45:12.452608] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:58.146 [2024-06-10 13:45:12.452614] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:58.146 [2024-06-10 13:45:12.452619] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:58.146 [2024-06-10 13:45:12.452625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:58.147 [2024-06-10 13:45:12.452630] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:58.147 [2024-06-10 13:45:12.452636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:58.147 13:45:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:58.407 [2024-06-10 13:45:12.660005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:58.407 BaseBdev1 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:58.407 13:45:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:58.407 13:45:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:58.666 [ 00:16:58.666 { 00:16:58.666 "name": "BaseBdev1", 00:16:58.666 "aliases": [ 00:16:58.666 "1e2d751d-85d9-4d37-a92f-49a97e54c0d3" 00:16:58.666 ], 00:16:58.666 "product_name": "Malloc disk", 00:16:58.666 "block_size": 512, 00:16:58.666 "num_blocks": 65536, 00:16:58.666 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:16:58.666 "assigned_rate_limits": { 00:16:58.666 "rw_ios_per_sec": 0, 00:16:58.666 "rw_mbytes_per_sec": 0, 00:16:58.666 "r_mbytes_per_sec": 0, 00:16:58.666 "w_mbytes_per_sec": 0 00:16:58.666 }, 00:16:58.666 "claimed": true, 00:16:58.666 "claim_type": "exclusive_write", 00:16:58.666 "zoned": false, 00:16:58.666 "supported_io_types": { 00:16:58.666 "read": true, 00:16:58.666 "write": true, 00:16:58.666 "unmap": true, 00:16:58.666 "write_zeroes": true, 00:16:58.666 "flush": true, 00:16:58.666 "reset": true, 00:16:58.666 "compare": false, 00:16:58.666 "compare_and_write": false, 00:16:58.666 "abort": true, 00:16:58.666 "nvme_admin": false, 00:16:58.666 "nvme_io": false 00:16:58.666 }, 00:16:58.666 "memory_domains": [ 00:16:58.666 { 00:16:58.666 "dma_device_id": "system", 00:16:58.666 "dma_device_type": 1 00:16:58.666 }, 00:16:58.666 { 00:16:58.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.666 
"dma_device_type": 2 00:16:58.666 } 00:16:58.666 ], 00:16:58.666 "driver_specific": {} 00:16:58.666 } 00:16:58.666 ] 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:58.666 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.667 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.926 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.926 "name": "Existed_Raid", 00:16:58.926 "uuid": "3fa92fc6-107f-4d97-a771-7bb9f5c2a45c", 00:16:58.926 "strip_size_kb": 64, 
00:16:58.926 "state": "configuring", 00:16:58.926 "raid_level": "concat", 00:16:58.926 "superblock": true, 00:16:58.926 "num_base_bdevs": 4, 00:16:58.926 "num_base_bdevs_discovered": 1, 00:16:58.926 "num_base_bdevs_operational": 4, 00:16:58.926 "base_bdevs_list": [ 00:16:58.926 { 00:16:58.926 "name": "BaseBdev1", 00:16:58.926 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:16:58.926 "is_configured": true, 00:16:58.926 "data_offset": 2048, 00:16:58.926 "data_size": 63488 00:16:58.926 }, 00:16:58.926 { 00:16:58.926 "name": "BaseBdev2", 00:16:58.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.926 "is_configured": false, 00:16:58.926 "data_offset": 0, 00:16:58.926 "data_size": 0 00:16:58.926 }, 00:16:58.926 { 00:16:58.926 "name": "BaseBdev3", 00:16:58.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.926 "is_configured": false, 00:16:58.926 "data_offset": 0, 00:16:58.926 "data_size": 0 00:16:58.926 }, 00:16:58.926 { 00:16:58.926 "name": "BaseBdev4", 00:16:58.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.926 "is_configured": false, 00:16:58.926 "data_offset": 0, 00:16:58.926 "data_size": 0 00:16:58.926 } 00:16:58.926 ] 00:16:58.926 }' 00:16:58.926 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.926 13:45:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:59.496 13:45:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:59.755 [2024-06-10 13:45:14.015426] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:59.755 [2024-06-10 13:45:14.015458] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24b9100 name Existed_Raid, state configuring 00:16:59.756 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:59.756 [2024-06-10 13:45:14.219983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:59.756 [2024-06-10 13:45:14.221175] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:59.756 [2024-06-10 13:45:14.221198] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:59.756 [2024-06-10 13:45:14.221204] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:59.756 [2024-06-10 13:45:14.221210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:59.756 [2024-06-10 13:45:14.221216] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:59.756 [2024-06-10 13:45:14.221222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.016 "name": "Existed_Raid", 00:17:00.016 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:00.016 "strip_size_kb": 64, 00:17:00.016 "state": "configuring", 00:17:00.016 "raid_level": "concat", 00:17:00.016 "superblock": true, 00:17:00.016 "num_base_bdevs": 4, 00:17:00.016 "num_base_bdevs_discovered": 1, 00:17:00.016 "num_base_bdevs_operational": 4, 00:17:00.016 "base_bdevs_list": [ 00:17:00.016 { 00:17:00.016 "name": "BaseBdev1", 00:17:00.016 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:00.016 "is_configured": true, 00:17:00.016 "data_offset": 2048, 00:17:00.016 "data_size": 63488 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev2", 00:17:00.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.016 "is_configured": false, 00:17:00.016 "data_offset": 0, 00:17:00.016 "data_size": 0 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev3", 00:17:00.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.016 "is_configured": false, 00:17:00.016 "data_offset": 0, 00:17:00.016 
"data_size": 0 00:17:00.016 }, 00:17:00.016 { 00:17:00.016 "name": "BaseBdev4", 00:17:00.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.016 "is_configured": false, 00:17:00.016 "data_offset": 0, 00:17:00.016 "data_size": 0 00:17:00.016 } 00:17:00.016 ] 00:17:00.016 }' 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.016 13:45:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.586 13:45:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:00.846 [2024-06-10 13:45:15.175499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:00.846 BaseBdev2 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:00.846 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:01.106 [ 
00:17:01.106 { 00:17:01.106 "name": "BaseBdev2", 00:17:01.106 "aliases": [ 00:17:01.106 "175de072-022b-45ca-bdc5-480629cd5eb3" 00:17:01.106 ], 00:17:01.106 "product_name": "Malloc disk", 00:17:01.106 "block_size": 512, 00:17:01.106 "num_blocks": 65536, 00:17:01.106 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:01.106 "assigned_rate_limits": { 00:17:01.106 "rw_ios_per_sec": 0, 00:17:01.106 "rw_mbytes_per_sec": 0, 00:17:01.106 "r_mbytes_per_sec": 0, 00:17:01.106 "w_mbytes_per_sec": 0 00:17:01.106 }, 00:17:01.106 "claimed": true, 00:17:01.106 "claim_type": "exclusive_write", 00:17:01.106 "zoned": false, 00:17:01.106 "supported_io_types": { 00:17:01.106 "read": true, 00:17:01.106 "write": true, 00:17:01.106 "unmap": true, 00:17:01.106 "write_zeroes": true, 00:17:01.106 "flush": true, 00:17:01.106 "reset": true, 00:17:01.106 "compare": false, 00:17:01.106 "compare_and_write": false, 00:17:01.106 "abort": true, 00:17:01.106 "nvme_admin": false, 00:17:01.106 "nvme_io": false 00:17:01.106 }, 00:17:01.106 "memory_domains": [ 00:17:01.106 { 00:17:01.106 "dma_device_id": "system", 00:17:01.106 "dma_device_type": 1 00:17:01.106 }, 00:17:01.106 { 00:17:01.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.106 "dma_device_type": 2 00:17:01.106 } 00:17:01.106 ], 00:17:01.106 "driver_specific": {} 00:17:01.106 } 00:17:01.106 ] 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.106 13:45:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.106 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.367 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.367 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.367 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.367 "name": "Existed_Raid", 00:17:01.367 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:01.367 "strip_size_kb": 64, 00:17:01.367 "state": "configuring", 00:17:01.367 "raid_level": "concat", 00:17:01.367 "superblock": true, 00:17:01.367 "num_base_bdevs": 4, 00:17:01.367 "num_base_bdevs_discovered": 2, 00:17:01.367 "num_base_bdevs_operational": 4, 00:17:01.367 "base_bdevs_list": [ 00:17:01.367 { 00:17:01.367 "name": "BaseBdev1", 00:17:01.367 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:01.367 "is_configured": true, 00:17:01.367 "data_offset": 2048, 00:17:01.367 "data_size": 63488 00:17:01.367 }, 00:17:01.367 { 00:17:01.367 
"name": "BaseBdev2", 00:17:01.367 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:01.367 "is_configured": true, 00:17:01.367 "data_offset": 2048, 00:17:01.367 "data_size": 63488 00:17:01.367 }, 00:17:01.367 { 00:17:01.367 "name": "BaseBdev3", 00:17:01.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.367 "is_configured": false, 00:17:01.367 "data_offset": 0, 00:17:01.367 "data_size": 0 00:17:01.367 }, 00:17:01.367 { 00:17:01.367 "name": "BaseBdev4", 00:17:01.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.367 "is_configured": false, 00:17:01.367 "data_offset": 0, 00:17:01.367 "data_size": 0 00:17:01.367 } 00:17:01.367 ] 00:17:01.367 }' 00:17:01.367 13:45:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.367 13:45:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:01.938 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:02.199 [2024-06-10 13:45:16.520026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:02.199 BaseBdev3 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:02.199 13:45:16 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:02.459 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:02.459 [ 00:17:02.459 { 00:17:02.459 "name": "BaseBdev3", 00:17:02.459 "aliases": [ 00:17:02.459 "0b569f23-654e-421a-b6de-c631c71672f5" 00:17:02.459 ], 00:17:02.459 "product_name": "Malloc disk", 00:17:02.459 "block_size": 512, 00:17:02.459 "num_blocks": 65536, 00:17:02.459 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:02.459 "assigned_rate_limits": { 00:17:02.459 "rw_ios_per_sec": 0, 00:17:02.459 "rw_mbytes_per_sec": 0, 00:17:02.459 "r_mbytes_per_sec": 0, 00:17:02.459 "w_mbytes_per_sec": 0 00:17:02.459 }, 00:17:02.459 "claimed": true, 00:17:02.459 "claim_type": "exclusive_write", 00:17:02.459 "zoned": false, 00:17:02.459 "supported_io_types": { 00:17:02.459 "read": true, 00:17:02.459 "write": true, 00:17:02.459 "unmap": true, 00:17:02.459 "write_zeroes": true, 00:17:02.459 "flush": true, 00:17:02.459 "reset": true, 00:17:02.459 "compare": false, 00:17:02.459 "compare_and_write": false, 00:17:02.459 "abort": true, 00:17:02.459 "nvme_admin": false, 00:17:02.459 "nvme_io": false 00:17:02.459 }, 00:17:02.459 "memory_domains": [ 00:17:02.459 { 00:17:02.459 "dma_device_id": "system", 00:17:02.459 "dma_device_type": 1 00:17:02.459 }, 00:17:02.459 { 00:17:02.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.459 "dma_device_type": 2 00:17:02.459 } 00:17:02.459 ], 00:17:02.459 "driver_specific": {} 00:17:02.459 } 00:17:02.459 ] 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.720 13:45:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.720 13:45:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.720 "name": "Existed_Raid", 00:17:02.720 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:02.720 "strip_size_kb": 64, 00:17:02.720 "state": "configuring", 00:17:02.720 "raid_level": "concat", 00:17:02.720 "superblock": true, 00:17:02.720 "num_base_bdevs": 4, 00:17:02.720 
"num_base_bdevs_discovered": 3, 00:17:02.720 "num_base_bdevs_operational": 4, 00:17:02.720 "base_bdevs_list": [ 00:17:02.720 { 00:17:02.720 "name": "BaseBdev1", 00:17:02.720 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:02.720 "is_configured": true, 00:17:02.720 "data_offset": 2048, 00:17:02.720 "data_size": 63488 00:17:02.720 }, 00:17:02.720 { 00:17:02.720 "name": "BaseBdev2", 00:17:02.720 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:02.720 "is_configured": true, 00:17:02.720 "data_offset": 2048, 00:17:02.720 "data_size": 63488 00:17:02.720 }, 00:17:02.720 { 00:17:02.720 "name": "BaseBdev3", 00:17:02.720 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:02.720 "is_configured": true, 00:17:02.720 "data_offset": 2048, 00:17:02.720 "data_size": 63488 00:17:02.720 }, 00:17:02.720 { 00:17:02.720 "name": "BaseBdev4", 00:17:02.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.720 "is_configured": false, 00:17:02.720 "data_offset": 0, 00:17:02.720 "data_size": 0 00:17:02.720 } 00:17:02.720 ] 00:17:02.720 }' 00:17:02.720 13:45:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.720 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.310 13:45:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:03.570 [2024-06-10 13:45:17.876499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:03.570 [2024-06-10 13:45:17.876623] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24ba160 00:17:03.570 [2024-06-10 13:45:17.876632] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:03.570 [2024-06-10 13:45:17.876775] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24a5f20 00:17:03.570 [2024-06-10 
13:45:17.876872] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24ba160 00:17:03.570 [2024-06-10 13:45:17.876878] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24ba160 00:17:03.570 [2024-06-10 13:45:17.876950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:03.570 BaseBdev4 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:03.570 13:45:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:03.830 [ 00:17:03.830 { 00:17:03.830 "name": "BaseBdev4", 00:17:03.830 "aliases": [ 00:17:03.830 "033357b4-3f35-4351-b559-5c7b2a737d0b" 00:17:03.830 ], 00:17:03.830 "product_name": "Malloc disk", 00:17:03.830 "block_size": 512, 00:17:03.830 "num_blocks": 65536, 00:17:03.830 "uuid": "033357b4-3f35-4351-b559-5c7b2a737d0b", 00:17:03.830 "assigned_rate_limits": { 00:17:03.830 "rw_ios_per_sec": 0, 00:17:03.830 "rw_mbytes_per_sec": 0, 00:17:03.830 "r_mbytes_per_sec": 0, 00:17:03.830 
"w_mbytes_per_sec": 0 00:17:03.830 }, 00:17:03.830 "claimed": true, 00:17:03.830 "claim_type": "exclusive_write", 00:17:03.830 "zoned": false, 00:17:03.830 "supported_io_types": { 00:17:03.830 "read": true, 00:17:03.830 "write": true, 00:17:03.830 "unmap": true, 00:17:03.830 "write_zeroes": true, 00:17:03.830 "flush": true, 00:17:03.830 "reset": true, 00:17:03.830 "compare": false, 00:17:03.830 "compare_and_write": false, 00:17:03.830 "abort": true, 00:17:03.830 "nvme_admin": false, 00:17:03.830 "nvme_io": false 00:17:03.830 }, 00:17:03.830 "memory_domains": [ 00:17:03.830 { 00:17:03.830 "dma_device_id": "system", 00:17:03.830 "dma_device_type": 1 00:17:03.830 }, 00:17:03.830 { 00:17:03.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.830 "dma_device_type": 2 00:17:03.830 } 00:17:03.830 ], 00:17:03.830 "driver_specific": {} 00:17:03.830 } 00:17:03.830 ] 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.830 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.090 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.090 "name": "Existed_Raid", 00:17:04.090 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:04.090 "strip_size_kb": 64, 00:17:04.090 "state": "online", 00:17:04.090 "raid_level": "concat", 00:17:04.090 "superblock": true, 00:17:04.090 "num_base_bdevs": 4, 00:17:04.090 "num_base_bdevs_discovered": 4, 00:17:04.090 "num_base_bdevs_operational": 4, 00:17:04.090 "base_bdevs_list": [ 00:17:04.090 { 00:17:04.090 "name": "BaseBdev1", 00:17:04.090 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:04.090 "is_configured": true, 00:17:04.090 "data_offset": 2048, 00:17:04.090 "data_size": 63488 00:17:04.090 }, 00:17:04.090 { 00:17:04.090 "name": "BaseBdev2", 00:17:04.090 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:04.090 "is_configured": true, 00:17:04.090 "data_offset": 2048, 00:17:04.090 "data_size": 63488 00:17:04.090 }, 00:17:04.090 { 00:17:04.090 "name": "BaseBdev3", 00:17:04.090 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:04.090 "is_configured": true, 00:17:04.090 "data_offset": 2048, 00:17:04.090 "data_size": 63488 00:17:04.090 }, 00:17:04.090 { 00:17:04.090 "name": "BaseBdev4", 00:17:04.090 "uuid": 
"033357b4-3f35-4351-b559-5c7b2a737d0b", 00:17:04.090 "is_configured": true, 00:17:04.090 "data_offset": 2048, 00:17:04.090 "data_size": 63488 00:17:04.090 } 00:17:04.090 ] 00:17:04.090 }' 00:17:04.090 13:45:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.090 13:45:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:04.660 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:04.920 [2024-06-10 13:45:19.204122] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:04.920 "name": "Existed_Raid", 00:17:04.920 "aliases": [ 00:17:04.920 "c3ab2c42-a5d7-4717-9d6b-761ac24efec6" 00:17:04.920 ], 00:17:04.920 "product_name": "Raid Volume", 00:17:04.920 "block_size": 512, 00:17:04.920 "num_blocks": 253952, 00:17:04.920 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:04.920 "assigned_rate_limits": { 00:17:04.920 "rw_ios_per_sec": 
0, 00:17:04.920 "rw_mbytes_per_sec": 0, 00:17:04.920 "r_mbytes_per_sec": 0, 00:17:04.920 "w_mbytes_per_sec": 0 00:17:04.920 }, 00:17:04.920 "claimed": false, 00:17:04.920 "zoned": false, 00:17:04.920 "supported_io_types": { 00:17:04.920 "read": true, 00:17:04.920 "write": true, 00:17:04.920 "unmap": true, 00:17:04.920 "write_zeroes": true, 00:17:04.920 "flush": true, 00:17:04.920 "reset": true, 00:17:04.920 "compare": false, 00:17:04.920 "compare_and_write": false, 00:17:04.920 "abort": false, 00:17:04.920 "nvme_admin": false, 00:17:04.920 "nvme_io": false 00:17:04.920 }, 00:17:04.920 "memory_domains": [ 00:17:04.920 { 00:17:04.920 "dma_device_id": "system", 00:17:04.920 "dma_device_type": 1 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.920 "dma_device_type": 2 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "system", 00:17:04.920 "dma_device_type": 1 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.920 "dma_device_type": 2 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "system", 00:17:04.920 "dma_device_type": 1 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.920 "dma_device_type": 2 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "system", 00:17:04.920 "dma_device_type": 1 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.920 "dma_device_type": 2 00:17:04.920 } 00:17:04.920 ], 00:17:04.920 "driver_specific": { 00:17:04.920 "raid": { 00:17:04.920 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:04.920 "strip_size_kb": 64, 00:17:04.920 "state": "online", 00:17:04.920 "raid_level": "concat", 00:17:04.920 "superblock": true, 00:17:04.920 "num_base_bdevs": 4, 00:17:04.920 "num_base_bdevs_discovered": 4, 00:17:04.920 "num_base_bdevs_operational": 4, 00:17:04.920 "base_bdevs_list": [ 00:17:04.920 { 00:17:04.920 "name": "BaseBdev1", 
00:17:04.920 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:04.920 "is_configured": true, 00:17:04.920 "data_offset": 2048, 00:17:04.920 "data_size": 63488 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "name": "BaseBdev2", 00:17:04.920 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:04.920 "is_configured": true, 00:17:04.920 "data_offset": 2048, 00:17:04.920 "data_size": 63488 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "name": "BaseBdev3", 00:17:04.920 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:04.920 "is_configured": true, 00:17:04.920 "data_offset": 2048, 00:17:04.920 "data_size": 63488 00:17:04.920 }, 00:17:04.920 { 00:17:04.920 "name": "BaseBdev4", 00:17:04.920 "uuid": "033357b4-3f35-4351-b559-5c7b2a737d0b", 00:17:04.920 "is_configured": true, 00:17:04.920 "data_offset": 2048, 00:17:04.920 "data_size": 63488 00:17:04.920 } 00:17:04.920 ] 00:17:04.920 } 00:17:04.920 } 00:17:04.920 }' 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:04.920 BaseBdev2 00:17:04.920 BaseBdev3 00:17:04.920 BaseBdev4' 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:04.920 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.180 "name": "BaseBdev1", 00:17:05.180 "aliases": [ 00:17:05.180 "1e2d751d-85d9-4d37-a92f-49a97e54c0d3" 00:17:05.180 ], 00:17:05.180 "product_name": "Malloc disk", 
00:17:05.180 "block_size": 512, 00:17:05.180 "num_blocks": 65536, 00:17:05.180 "uuid": "1e2d751d-85d9-4d37-a92f-49a97e54c0d3", 00:17:05.180 "assigned_rate_limits": { 00:17:05.180 "rw_ios_per_sec": 0, 00:17:05.180 "rw_mbytes_per_sec": 0, 00:17:05.180 "r_mbytes_per_sec": 0, 00:17:05.180 "w_mbytes_per_sec": 0 00:17:05.180 }, 00:17:05.180 "claimed": true, 00:17:05.180 "claim_type": "exclusive_write", 00:17:05.180 "zoned": false, 00:17:05.180 "supported_io_types": { 00:17:05.180 "read": true, 00:17:05.180 "write": true, 00:17:05.180 "unmap": true, 00:17:05.180 "write_zeroes": true, 00:17:05.180 "flush": true, 00:17:05.180 "reset": true, 00:17:05.180 "compare": false, 00:17:05.180 "compare_and_write": false, 00:17:05.180 "abort": true, 00:17:05.180 "nvme_admin": false, 00:17:05.180 "nvme_io": false 00:17:05.180 }, 00:17:05.180 "memory_domains": [ 00:17:05.180 { 00:17:05.180 "dma_device_id": "system", 00:17:05.180 "dma_device_type": 1 00:17:05.180 }, 00:17:05.180 { 00:17:05.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.180 "dma_device_type": 2 00:17:05.180 } 00:17:05.180 ], 00:17:05.180 "driver_specific": {} 00:17:05.180 }' 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.180 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.440 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:05.441 13:45:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.700 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.700 "name": "BaseBdev2", 00:17:05.700 "aliases": [ 00:17:05.700 "175de072-022b-45ca-bdc5-480629cd5eb3" 00:17:05.700 ], 00:17:05.700 "product_name": "Malloc disk", 00:17:05.700 "block_size": 512, 00:17:05.700 "num_blocks": 65536, 00:17:05.700 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:05.700 "assigned_rate_limits": { 00:17:05.700 "rw_ios_per_sec": 0, 00:17:05.700 "rw_mbytes_per_sec": 0, 00:17:05.700 "r_mbytes_per_sec": 0, 00:17:05.700 "w_mbytes_per_sec": 0 00:17:05.700 }, 00:17:05.700 "claimed": true, 00:17:05.700 "claim_type": "exclusive_write", 00:17:05.700 "zoned": false, 00:17:05.700 "supported_io_types": { 00:17:05.700 "read": true, 00:17:05.700 "write": true, 00:17:05.700 "unmap": true, 00:17:05.700 "write_zeroes": true, 00:17:05.700 "flush": true, 00:17:05.700 "reset": true, 00:17:05.700 "compare": false, 00:17:05.700 "compare_and_write": false, 00:17:05.700 "abort": true, 00:17:05.700 "nvme_admin": false, 00:17:05.700 "nvme_io": false 00:17:05.700 }, 00:17:05.700 "memory_domains": [ 00:17:05.700 { 
00:17:05.700 "dma_device_id": "system", 00:17:05.700 "dma_device_type": 1 00:17:05.700 }, 00:17:05.700 { 00:17:05.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.700 "dma_device_type": 2 00:17:05.700 } 00:17:05.700 ], 00:17:05.700 "driver_specific": {} 00:17:05.700 }' 00:17:05.700 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.700 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.700 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.700 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:05.961 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.221 13:45:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.221 "name": "BaseBdev3", 00:17:06.221 "aliases": [ 00:17:06.221 "0b569f23-654e-421a-b6de-c631c71672f5" 00:17:06.221 ], 00:17:06.221 "product_name": "Malloc disk", 00:17:06.221 "block_size": 512, 00:17:06.221 "num_blocks": 65536, 00:17:06.221 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:06.221 "assigned_rate_limits": { 00:17:06.221 "rw_ios_per_sec": 0, 00:17:06.221 "rw_mbytes_per_sec": 0, 00:17:06.221 "r_mbytes_per_sec": 0, 00:17:06.221 "w_mbytes_per_sec": 0 00:17:06.221 }, 00:17:06.221 "claimed": true, 00:17:06.221 "claim_type": "exclusive_write", 00:17:06.221 "zoned": false, 00:17:06.221 "supported_io_types": { 00:17:06.221 "read": true, 00:17:06.221 "write": true, 00:17:06.221 "unmap": true, 00:17:06.221 "write_zeroes": true, 00:17:06.221 "flush": true, 00:17:06.221 "reset": true, 00:17:06.221 "compare": false, 00:17:06.221 "compare_and_write": false, 00:17:06.221 "abort": true, 00:17:06.221 "nvme_admin": false, 00:17:06.221 "nvme_io": false 00:17:06.221 }, 00:17:06.221 "memory_domains": [ 00:17:06.221 { 00:17:06.221 "dma_device_id": "system", 00:17:06.221 "dma_device_type": 1 00:17:06.221 }, 00:17:06.221 { 00:17:06.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.221 "dma_device_type": 2 00:17:06.221 } 00:17:06.221 ], 00:17:06.221 "driver_specific": {} 00:17:06.221 }' 00:17:06.221 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.221 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.480 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.740 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.740 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.740 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:06.740 13:45:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.740 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.740 "name": "BaseBdev4", 00:17:06.740 "aliases": [ 00:17:06.740 "033357b4-3f35-4351-b559-5c7b2a737d0b" 00:17:06.740 ], 00:17:06.740 "product_name": "Malloc disk", 00:17:06.740 "block_size": 512, 00:17:06.740 "num_blocks": 65536, 00:17:06.740 "uuid": "033357b4-3f35-4351-b559-5c7b2a737d0b", 00:17:06.740 "assigned_rate_limits": { 00:17:06.740 "rw_ios_per_sec": 0, 00:17:06.740 "rw_mbytes_per_sec": 0, 00:17:06.740 "r_mbytes_per_sec": 0, 00:17:06.740 "w_mbytes_per_sec": 0 00:17:06.740 }, 00:17:06.740 "claimed": true, 00:17:06.740 "claim_type": "exclusive_write", 00:17:06.740 "zoned": false, 00:17:06.740 "supported_io_types": { 00:17:06.740 "read": true, 00:17:06.740 "write": true, 00:17:06.740 "unmap": true, 00:17:06.740 "write_zeroes": true, 00:17:06.740 "flush": 
true, 00:17:06.740 "reset": true, 00:17:06.740 "compare": false, 00:17:06.740 "compare_and_write": false, 00:17:06.740 "abort": true, 00:17:06.740 "nvme_admin": false, 00:17:06.740 "nvme_io": false 00:17:06.740 }, 00:17:06.740 "memory_domains": [ 00:17:06.740 { 00:17:06.740 "dma_device_id": "system", 00:17:06.740 "dma_device_type": 1 00:17:06.740 }, 00:17:06.740 { 00:17:06.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.740 "dma_device_type": 2 00:17:06.740 } 00:17:06.740 ], 00:17:06.740 "driver_specific": {} 00:17:06.740 }' 00:17:06.740 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.740 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.000 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.001 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:17:07.261 [2024-06-10 13:45:21.678598] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:07.261 [2024-06-10 13:45:21.678617] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:07.261 [2024-06-10 13:45:21.678655] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.261 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.521 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.521 "name": "Existed_Raid", 00:17:07.521 "uuid": "c3ab2c42-a5d7-4717-9d6b-761ac24efec6", 00:17:07.521 "strip_size_kb": 64, 00:17:07.521 "state": "offline", 00:17:07.521 "raid_level": "concat", 00:17:07.521 "superblock": true, 00:17:07.521 "num_base_bdevs": 4, 00:17:07.521 "num_base_bdevs_discovered": 3, 00:17:07.521 "num_base_bdevs_operational": 3, 00:17:07.521 "base_bdevs_list": [ 00:17:07.521 { 00:17:07.521 "name": null, 00:17:07.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.521 "is_configured": false, 00:17:07.521 "data_offset": 2048, 00:17:07.521 "data_size": 63488 00:17:07.521 }, 00:17:07.521 { 00:17:07.521 "name": "BaseBdev2", 00:17:07.521 "uuid": "175de072-022b-45ca-bdc5-480629cd5eb3", 00:17:07.521 "is_configured": true, 00:17:07.521 "data_offset": 2048, 00:17:07.521 "data_size": 63488 00:17:07.521 }, 00:17:07.521 { 00:17:07.521 "name": "BaseBdev3", 00:17:07.521 "uuid": "0b569f23-654e-421a-b6de-c631c71672f5", 00:17:07.521 "is_configured": true, 00:17:07.521 "data_offset": 2048, 00:17:07.521 "data_size": 63488 00:17:07.521 }, 00:17:07.521 { 00:17:07.521 "name": "BaseBdev4", 00:17:07.521 "uuid": "033357b4-3f35-4351-b559-5c7b2a737d0b", 00:17:07.521 "is_configured": true, 00:17:07.521 "data_offset": 2048, 00:17:07.521 "data_size": 63488 00:17:07.521 } 00:17:07.521 ] 00:17:07.521 }' 00:17:07.521 13:45:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:07.521 13:45:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:08.090 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:08.090 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:08.090 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.090 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:08.350 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:08.350 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:08.350 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:08.610 [2024-06-10 13:45:22.841550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:08.610 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:08.610 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:08.610 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.610 13:45:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:08.610 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:08.610 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:08.610 
13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:08.870 [2024-06-10 13:45:23.252566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:08.870 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:08.870 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:08.870 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.870 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:09.131 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:09.131 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:09.131 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:09.391 [2024-06-10 13:45:23.663630] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:09.391 [2024-06-10 13:45:23.663660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24ba160 name Existed_Raid, state offline 00:17:09.391 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:09.391 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:09.391 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.391 13:45:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:09.651 13:45:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:09.651 BaseBdev2 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:09.651 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.911 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:17:10.171 [ 00:17:10.171 { 00:17:10.171 "name": "BaseBdev2", 00:17:10.171 "aliases": [ 00:17:10.171 "e5fa943c-7cec-4a9a-a422-929a6baa2943" 00:17:10.171 ], 00:17:10.171 "product_name": "Malloc disk", 00:17:10.171 "block_size": 512, 00:17:10.171 "num_blocks": 65536, 00:17:10.171 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:10.171 "assigned_rate_limits": { 00:17:10.171 "rw_ios_per_sec": 0, 00:17:10.171 "rw_mbytes_per_sec": 0, 00:17:10.171 "r_mbytes_per_sec": 0, 00:17:10.171 "w_mbytes_per_sec": 0 00:17:10.171 }, 00:17:10.171 "claimed": false, 00:17:10.171 "zoned": false, 00:17:10.171 "supported_io_types": { 00:17:10.171 "read": true, 00:17:10.171 "write": true, 00:17:10.171 "unmap": true, 00:17:10.171 "write_zeroes": true, 00:17:10.171 "flush": true, 00:17:10.171 "reset": true, 00:17:10.171 "compare": false, 00:17:10.171 "compare_and_write": false, 00:17:10.171 "abort": true, 00:17:10.171 "nvme_admin": false, 00:17:10.171 "nvme_io": false 00:17:10.171 }, 00:17:10.171 "memory_domains": [ 00:17:10.171 { 00:17:10.171 "dma_device_id": "system", 00:17:10.171 "dma_device_type": 1 00:17:10.171 }, 00:17:10.171 { 00:17:10.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.171 "dma_device_type": 2 00:17:10.171 } 00:17:10.171 ], 00:17:10.171 "driver_specific": {} 00:17:10.171 } 00:17:10.171 ] 00:17:10.171 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:10.171 13:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:10.171 13:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:10.171 13:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:10.431 BaseBdev3 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.431 13:45:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:10.691 [ 00:17:10.691 { 00:17:10.691 "name": "BaseBdev3", 00:17:10.691 "aliases": [ 00:17:10.691 "5a16f325-29e3-4738-a5fd-9d5e117bef7e" 00:17:10.691 ], 00:17:10.691 "product_name": "Malloc disk", 00:17:10.691 "block_size": 512, 00:17:10.691 "num_blocks": 65536, 00:17:10.691 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:10.691 "assigned_rate_limits": { 00:17:10.691 "rw_ios_per_sec": 0, 00:17:10.691 "rw_mbytes_per_sec": 0, 00:17:10.691 "r_mbytes_per_sec": 0, 00:17:10.691 "w_mbytes_per_sec": 0 00:17:10.691 }, 00:17:10.691 "claimed": false, 00:17:10.691 "zoned": false, 00:17:10.691 "supported_io_types": { 00:17:10.691 "read": true, 00:17:10.691 "write": true, 00:17:10.691 "unmap": true, 00:17:10.691 "write_zeroes": true, 00:17:10.691 "flush": true, 00:17:10.691 "reset": true, 00:17:10.691 "compare": false, 00:17:10.691 "compare_and_write": false, 00:17:10.691 "abort": true, 00:17:10.691 "nvme_admin": false, 00:17:10.691 "nvme_io": false 00:17:10.691 }, 00:17:10.691 
"memory_domains": [ 00:17:10.691 { 00:17:10.691 "dma_device_id": "system", 00:17:10.691 "dma_device_type": 1 00:17:10.691 }, 00:17:10.691 { 00:17:10.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.691 "dma_device_type": 2 00:17:10.691 } 00:17:10.691 ], 00:17:10.691 "driver_specific": {} 00:17:10.691 } 00:17:10.691 ] 00:17:10.691 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:10.691 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:10.691 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:10.691 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:10.952 BaseBdev4 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:10.952 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.213 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:17:11.213 [ 00:17:11.213 { 00:17:11.213 "name": "BaseBdev4", 00:17:11.213 "aliases": [ 00:17:11.213 "eba3155f-3e21-45fa-a193-8bbaeca22740" 00:17:11.213 ], 00:17:11.213 "product_name": "Malloc disk", 00:17:11.213 "block_size": 512, 00:17:11.213 "num_blocks": 65536, 00:17:11.213 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:11.213 "assigned_rate_limits": { 00:17:11.213 "rw_ios_per_sec": 0, 00:17:11.213 "rw_mbytes_per_sec": 0, 00:17:11.213 "r_mbytes_per_sec": 0, 00:17:11.213 "w_mbytes_per_sec": 0 00:17:11.213 }, 00:17:11.213 "claimed": false, 00:17:11.213 "zoned": false, 00:17:11.213 "supported_io_types": { 00:17:11.213 "read": true, 00:17:11.213 "write": true, 00:17:11.213 "unmap": true, 00:17:11.213 "write_zeroes": true, 00:17:11.213 "flush": true, 00:17:11.213 "reset": true, 00:17:11.213 "compare": false, 00:17:11.213 "compare_and_write": false, 00:17:11.213 "abort": true, 00:17:11.213 "nvme_admin": false, 00:17:11.213 "nvme_io": false 00:17:11.213 }, 00:17:11.213 "memory_domains": [ 00:17:11.213 { 00:17:11.213 "dma_device_id": "system", 00:17:11.213 "dma_device_type": 1 00:17:11.213 }, 00:17:11.213 { 00:17:11.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.213 "dma_device_type": 2 00:17:11.213 } 00:17:11.213 ], 00:17:11.213 "driver_specific": {} 00:17:11.213 } 00:17:11.213 ] 00:17:11.213 13:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:11.213 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:11.213 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:11.213 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:11.473 [2024-06-10 13:45:25.859643] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:11.473 [2024-06-10 13:45:25.859672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:11.473 [2024-06-10 13:45:25.859685] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:11.473 [2024-06-10 13:45:25.860793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:11.473 [2024-06-10 13:45:25.860827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:17:11.473 13:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.733 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.733 "name": "Existed_Raid", 00:17:11.733 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:11.733 "strip_size_kb": 64, 00:17:11.733 "state": "configuring", 00:17:11.733 "raid_level": "concat", 00:17:11.733 "superblock": true, 00:17:11.733 "num_base_bdevs": 4, 00:17:11.733 "num_base_bdevs_discovered": 3, 00:17:11.733 "num_base_bdevs_operational": 4, 00:17:11.733 "base_bdevs_list": [ 00:17:11.733 { 00:17:11.733 "name": "BaseBdev1", 00:17:11.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.733 "is_configured": false, 00:17:11.733 "data_offset": 0, 00:17:11.733 "data_size": 0 00:17:11.733 }, 00:17:11.733 { 00:17:11.733 "name": "BaseBdev2", 00:17:11.733 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:11.733 "is_configured": true, 00:17:11.733 "data_offset": 2048, 00:17:11.733 "data_size": 63488 00:17:11.733 }, 00:17:11.733 { 00:17:11.733 "name": "BaseBdev3", 00:17:11.733 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:11.733 "is_configured": true, 00:17:11.733 "data_offset": 2048, 00:17:11.733 "data_size": 63488 00:17:11.733 }, 00:17:11.733 { 00:17:11.733 "name": "BaseBdev4", 00:17:11.733 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:11.733 "is_configured": true, 00:17:11.733 "data_offset": 2048, 00:17:11.733 "data_size": 63488 00:17:11.733 } 00:17:11.733 ] 00:17:11.733 }' 00:17:11.733 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.733 13:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:12.304 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:17:12.565 [2024-06-10 13:45:26.818051] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.565 13:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.565 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.565 "name": "Existed_Raid", 00:17:12.565 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:12.565 "strip_size_kb": 64, 00:17:12.565 "state": "configuring", 00:17:12.565 "raid_level": "concat", 00:17:12.565 "superblock": true, 
00:17:12.565 "num_base_bdevs": 4, 00:17:12.565 "num_base_bdevs_discovered": 2, 00:17:12.565 "num_base_bdevs_operational": 4, 00:17:12.565 "base_bdevs_list": [ 00:17:12.565 { 00:17:12.565 "name": "BaseBdev1", 00:17:12.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.565 "is_configured": false, 00:17:12.565 "data_offset": 0, 00:17:12.565 "data_size": 0 00:17:12.565 }, 00:17:12.565 { 00:17:12.565 "name": null, 00:17:12.565 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:12.565 "is_configured": false, 00:17:12.565 "data_offset": 2048, 00:17:12.565 "data_size": 63488 00:17:12.565 }, 00:17:12.565 { 00:17:12.565 "name": "BaseBdev3", 00:17:12.565 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:12.565 "is_configured": true, 00:17:12.565 "data_offset": 2048, 00:17:12.565 "data_size": 63488 00:17:12.565 }, 00:17:12.565 { 00:17:12.565 "name": "BaseBdev4", 00:17:12.565 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:12.565 "is_configured": true, 00:17:12.565 "data_offset": 2048, 00:17:12.565 "data_size": 63488 00:17:12.565 } 00:17:12.565 ] 00:17:12.565 }' 00:17:12.565 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.565 13:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.137 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.137 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:13.397 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:13.397 13:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:13.657 [2024-06-10 
13:45:27.998171] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.657 BaseBdev1 00:17:13.657 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:13.658 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.918 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:13.918 [ 00:17:13.918 { 00:17:13.918 "name": "BaseBdev1", 00:17:13.918 "aliases": [ 00:17:13.918 "eefc3b8a-b375-48e9-a856-6b4dc939f87d" 00:17:13.918 ], 00:17:13.918 "product_name": "Malloc disk", 00:17:13.918 "block_size": 512, 00:17:13.918 "num_blocks": 65536, 00:17:13.918 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:13.918 "assigned_rate_limits": { 00:17:13.918 "rw_ios_per_sec": 0, 00:17:13.918 "rw_mbytes_per_sec": 0, 00:17:13.918 "r_mbytes_per_sec": 0, 00:17:13.918 "w_mbytes_per_sec": 0 00:17:13.918 }, 00:17:13.918 "claimed": true, 00:17:13.918 "claim_type": "exclusive_write", 00:17:13.918 "zoned": false, 00:17:13.918 "supported_io_types": { 00:17:13.918 "read": true, 00:17:13.918 "write": true, 00:17:13.918 "unmap": true, 
00:17:13.918 "write_zeroes": true, 00:17:13.918 "flush": true, 00:17:13.918 "reset": true, 00:17:13.918 "compare": false, 00:17:13.918 "compare_and_write": false, 00:17:13.918 "abort": true, 00:17:13.918 "nvme_admin": false, 00:17:13.918 "nvme_io": false 00:17:13.918 }, 00:17:13.918 "memory_domains": [ 00:17:13.918 { 00:17:13.918 "dma_device_id": "system", 00:17:13.918 "dma_device_type": 1 00:17:13.918 }, 00:17:13.918 { 00:17:13.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.918 "dma_device_type": 2 00:17:13.918 } 00:17:13.918 ], 00:17:13.918 "driver_specific": {} 00:17:13.918 } 00:17:13.918 ] 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.178 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.178 "name": "Existed_Raid", 00:17:14.178 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:14.178 "strip_size_kb": 64, 00:17:14.178 "state": "configuring", 00:17:14.178 "raid_level": "concat", 00:17:14.178 "superblock": true, 00:17:14.178 "num_base_bdevs": 4, 00:17:14.178 "num_base_bdevs_discovered": 3, 00:17:14.178 "num_base_bdevs_operational": 4, 00:17:14.178 "base_bdevs_list": [ 00:17:14.178 { 00:17:14.178 "name": "BaseBdev1", 00:17:14.178 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:14.178 "is_configured": true, 00:17:14.178 "data_offset": 2048, 00:17:14.178 "data_size": 63488 00:17:14.178 }, 00:17:14.178 { 00:17:14.178 "name": null, 00:17:14.178 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:14.178 "is_configured": false, 00:17:14.178 "data_offset": 2048, 00:17:14.178 "data_size": 63488 00:17:14.178 }, 00:17:14.178 { 00:17:14.179 "name": "BaseBdev3", 00:17:14.179 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:14.179 "is_configured": true, 00:17:14.179 "data_offset": 2048, 00:17:14.179 "data_size": 63488 00:17:14.179 }, 00:17:14.179 { 00:17:14.179 "name": "BaseBdev4", 00:17:14.179 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:14.179 "is_configured": true, 00:17:14.179 "data_offset": 2048, 00:17:14.179 "data_size": 63488 00:17:14.179 } 00:17:14.179 ] 00:17:14.179 }' 00:17:14.179 13:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.179 13:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.748 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.748 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:15.008 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:15.008 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:15.268 [2024-06-10 13:45:29.562159] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.268 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.528 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.528 "name": "Existed_Raid", 00:17:15.528 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:15.528 "strip_size_kb": 64, 00:17:15.528 "state": "configuring", 00:17:15.528 "raid_level": "concat", 00:17:15.528 "superblock": true, 00:17:15.528 "num_base_bdevs": 4, 00:17:15.528 "num_base_bdevs_discovered": 2, 00:17:15.528 "num_base_bdevs_operational": 4, 00:17:15.528 "base_bdevs_list": [ 00:17:15.528 { 00:17:15.528 "name": "BaseBdev1", 00:17:15.528 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:15.528 "is_configured": true, 00:17:15.528 "data_offset": 2048, 00:17:15.528 "data_size": 63488 00:17:15.528 }, 00:17:15.528 { 00:17:15.528 "name": null, 00:17:15.528 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:15.528 "is_configured": false, 00:17:15.528 "data_offset": 2048, 00:17:15.528 "data_size": 63488 00:17:15.528 }, 00:17:15.528 { 00:17:15.528 "name": null, 00:17:15.528 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:15.528 "is_configured": false, 00:17:15.528 "data_offset": 2048, 00:17:15.528 "data_size": 63488 00:17:15.528 }, 00:17:15.528 { 00:17:15.528 "name": "BaseBdev4", 00:17:15.528 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:15.528 "is_configured": true, 00:17:15.528 "data_offset": 2048, 00:17:15.528 "data_size": 63488 00:17:15.528 } 00:17:15.528 ] 00:17:15.528 }' 00:17:15.528 13:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.528 13:45:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.098 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.098 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:16.098 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:16.098 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:16.358 [2024-06-10 13:45:30.725116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.358 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.618 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.618 "name": "Existed_Raid", 00:17:16.618 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:16.618 "strip_size_kb": 64, 00:17:16.618 "state": "configuring", 00:17:16.618 "raid_level": "concat", 00:17:16.618 "superblock": true, 00:17:16.618 "num_base_bdevs": 4, 00:17:16.618 "num_base_bdevs_discovered": 3, 00:17:16.618 "num_base_bdevs_operational": 4, 00:17:16.618 "base_bdevs_list": [ 00:17:16.618 { 00:17:16.618 "name": "BaseBdev1", 00:17:16.618 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:16.618 "is_configured": true, 00:17:16.618 "data_offset": 2048, 00:17:16.618 "data_size": 63488 00:17:16.618 }, 00:17:16.618 { 00:17:16.618 "name": null, 00:17:16.618 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:16.618 "is_configured": false, 00:17:16.618 "data_offset": 2048, 00:17:16.618 "data_size": 63488 00:17:16.618 }, 00:17:16.619 { 00:17:16.619 "name": "BaseBdev3", 00:17:16.619 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:16.619 "is_configured": true, 00:17:16.619 "data_offset": 2048, 00:17:16.619 "data_size": 63488 00:17:16.619 }, 00:17:16.619 { 00:17:16.619 "name": "BaseBdev4", 00:17:16.619 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:16.619 "is_configured": true, 00:17:16.619 "data_offset": 2048, 00:17:16.619 "data_size": 63488 00:17:16.619 } 00:17:16.619 ] 00:17:16.619 }' 00:17:16.619 13:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.619 13:45:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:17.188 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.188 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:17.449 [2024-06-10 13:45:31.860008] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.449 13:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.710 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.710 "name": "Existed_Raid", 00:17:17.710 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:17.710 "strip_size_kb": 64, 00:17:17.710 "state": "configuring", 00:17:17.710 "raid_level": "concat", 00:17:17.710 "superblock": true, 00:17:17.710 "num_base_bdevs": 4, 00:17:17.710 "num_base_bdevs_discovered": 2, 00:17:17.710 "num_base_bdevs_operational": 4, 00:17:17.710 "base_bdevs_list": [ 00:17:17.711 { 00:17:17.711 "name": null, 00:17:17.711 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:17.711 "is_configured": false, 00:17:17.711 "data_offset": 2048, 00:17:17.711 "data_size": 63488 00:17:17.711 }, 00:17:17.711 { 00:17:17.711 "name": null, 00:17:17.711 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:17.711 "is_configured": false, 00:17:17.711 "data_offset": 2048, 00:17:17.711 "data_size": 63488 00:17:17.711 }, 00:17:17.711 { 00:17:17.711 "name": "BaseBdev3", 00:17:17.711 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:17.711 "is_configured": true, 00:17:17.711 "data_offset": 2048, 00:17:17.711 "data_size": 63488 00:17:17.711 }, 00:17:17.711 { 00:17:17.711 "name": "BaseBdev4", 00:17:17.711 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:17.711 "is_configured": true, 00:17:17.711 "data_offset": 2048, 00:17:17.711 "data_size": 63488 00:17:17.711 } 00:17:17.711 ] 00:17:17.711 }' 00:17:17.711 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.711 13:45:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.281 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.281 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:18.541 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:18.541 13:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:18.801 [2024-06-10 13:45:33.020918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.801 "name": "Existed_Raid", 00:17:18.801 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:18.801 "strip_size_kb": 64, 00:17:18.801 "state": "configuring", 00:17:18.801 "raid_level": "concat", 00:17:18.801 "superblock": true, 00:17:18.801 "num_base_bdevs": 4, 00:17:18.801 "num_base_bdevs_discovered": 3, 00:17:18.801 "num_base_bdevs_operational": 4, 00:17:18.801 "base_bdevs_list": [ 00:17:18.801 { 00:17:18.801 "name": null, 00:17:18.801 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:18.801 "is_configured": false, 00:17:18.801 "data_offset": 2048, 00:17:18.801 "data_size": 63488 00:17:18.801 }, 00:17:18.801 { 00:17:18.801 "name": "BaseBdev2", 00:17:18.801 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:18.801 "is_configured": true, 00:17:18.801 "data_offset": 2048, 00:17:18.801 "data_size": 63488 00:17:18.801 }, 00:17:18.801 { 00:17:18.801 "name": "BaseBdev3", 00:17:18.801 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:18.801 "is_configured": true, 00:17:18.801 "data_offset": 2048, 00:17:18.801 "data_size": 63488 00:17:18.801 }, 00:17:18.801 { 00:17:18.801 "name": "BaseBdev4", 00:17:18.801 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:18.801 "is_configured": true, 00:17:18.801 "data_offset": 2048, 00:17:18.801 "data_size": 63488 00:17:18.801 } 00:17:18.801 ] 00:17:18.801 }' 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.801 13:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.451 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.451 13:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:19.716 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:19.716 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:19.716 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u eefc3b8a-b375-48e9-a856-6b4dc939f87d 00:17:19.976 [2024-06-10 13:45:34.405530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:19.976 [2024-06-10 13:45:34.405644] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24be8f0 00:17:19.976 [2024-06-10 13:45:34.405652] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:19.976 [2024-06-10 13:45:34.405801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24bc480 00:17:19.976 [2024-06-10 13:45:34.405894] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24be8f0 00:17:19.976 [2024-06-10 13:45:34.405900] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24be8f0 00:17:19.976 [2024-06-10 13:45:34.405971] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.976 NewBaseBdev 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:19.976 13:45:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:19.976 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.236 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:20.496 [ 00:17:20.496 { 00:17:20.496 "name": "NewBaseBdev", 00:17:20.496 "aliases": [ 00:17:20.496 "eefc3b8a-b375-48e9-a856-6b4dc939f87d" 00:17:20.496 ], 00:17:20.496 "product_name": "Malloc disk", 00:17:20.496 "block_size": 512, 00:17:20.496 "num_blocks": 65536, 00:17:20.496 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:20.496 "assigned_rate_limits": { 00:17:20.496 "rw_ios_per_sec": 0, 00:17:20.496 "rw_mbytes_per_sec": 0, 00:17:20.496 "r_mbytes_per_sec": 0, 00:17:20.496 "w_mbytes_per_sec": 0 00:17:20.496 }, 00:17:20.496 "claimed": true, 00:17:20.496 "claim_type": "exclusive_write", 00:17:20.496 "zoned": false, 00:17:20.496 "supported_io_types": { 00:17:20.496 "read": true, 00:17:20.496 "write": true, 00:17:20.496 "unmap": true, 00:17:20.496 "write_zeroes": true, 00:17:20.496 "flush": true, 00:17:20.496 "reset": true, 00:17:20.496 "compare": false, 00:17:20.496 "compare_and_write": false, 00:17:20.496 "abort": true, 00:17:20.496 "nvme_admin": false, 00:17:20.496 "nvme_io": false 
00:17:20.496 }, 00:17:20.496 "memory_domains": [ 00:17:20.496 { 00:17:20.496 "dma_device_id": "system", 00:17:20.496 "dma_device_type": 1 00:17:20.496 }, 00:17:20.496 { 00:17:20.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.496 "dma_device_type": 2 00:17:20.496 } 00:17:20.496 ], 00:17:20.496 "driver_specific": {} 00:17:20.496 } 00:17:20.496 ] 00:17:20.496 13:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:20.496 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:17:20.496 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.497 13:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.757 
13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.757 "name": "Existed_Raid", 00:17:20.757 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:20.757 "strip_size_kb": 64, 00:17:20.757 "state": "online", 00:17:20.757 "raid_level": "concat", 00:17:20.757 "superblock": true, 00:17:20.757 "num_base_bdevs": 4, 00:17:20.757 "num_base_bdevs_discovered": 4, 00:17:20.757 "num_base_bdevs_operational": 4, 00:17:20.757 "base_bdevs_list": [ 00:17:20.757 { 00:17:20.757 "name": "NewBaseBdev", 00:17:20.757 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:20.757 "is_configured": true, 00:17:20.757 "data_offset": 2048, 00:17:20.757 "data_size": 63488 00:17:20.757 }, 00:17:20.757 { 00:17:20.757 "name": "BaseBdev2", 00:17:20.757 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:20.757 "is_configured": true, 00:17:20.757 "data_offset": 2048, 00:17:20.757 "data_size": 63488 00:17:20.757 }, 00:17:20.757 { 00:17:20.757 "name": "BaseBdev3", 00:17:20.757 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:20.757 "is_configured": true, 00:17:20.757 "data_offset": 2048, 00:17:20.757 "data_size": 63488 00:17:20.757 }, 00:17:20.757 { 00:17:20.757 "name": "BaseBdev4", 00:17:20.757 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:20.757 "is_configured": true, 00:17:20.757 "data_offset": 2048, 00:17:20.757 "data_size": 63488 00:17:20.757 } 00:17:20.757 ] 00:17:20.757 }' 00:17:20.757 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.757 13:45:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:21.327 [2024-06-10 13:45:35.749199] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:21.327 "name": "Existed_Raid", 00:17:21.327 "aliases": [ 00:17:21.327 "b99c8805-3a25-43e3-b4de-5edd02cf3e91" 00:17:21.327 ], 00:17:21.327 "product_name": "Raid Volume", 00:17:21.327 "block_size": 512, 00:17:21.327 "num_blocks": 253952, 00:17:21.327 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:21.327 "assigned_rate_limits": { 00:17:21.327 "rw_ios_per_sec": 0, 00:17:21.327 "rw_mbytes_per_sec": 0, 00:17:21.327 "r_mbytes_per_sec": 0, 00:17:21.327 "w_mbytes_per_sec": 0 00:17:21.327 }, 00:17:21.327 "claimed": false, 00:17:21.327 "zoned": false, 00:17:21.327 "supported_io_types": { 00:17:21.327 "read": true, 00:17:21.327 "write": true, 00:17:21.327 "unmap": true, 00:17:21.327 "write_zeroes": true, 00:17:21.327 "flush": true, 00:17:21.327 "reset": true, 00:17:21.327 "compare": false, 00:17:21.327 "compare_and_write": false, 00:17:21.327 "abort": false, 00:17:21.327 "nvme_admin": false, 00:17:21.327 "nvme_io": false 00:17:21.327 }, 00:17:21.327 "memory_domains": [ 00:17:21.327 { 00:17:21.327 "dma_device_id": "system", 00:17:21.327 "dma_device_type": 1 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.327 "dma_device_type": 2 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "system", 00:17:21.327 "dma_device_type": 1 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.327 "dma_device_type": 2 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "system", 00:17:21.327 "dma_device_type": 1 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.327 "dma_device_type": 2 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "system", 00:17:21.327 "dma_device_type": 1 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.327 "dma_device_type": 2 00:17:21.327 } 00:17:21.327 ], 00:17:21.327 "driver_specific": { 00:17:21.327 "raid": { 00:17:21.327 "uuid": "b99c8805-3a25-43e3-b4de-5edd02cf3e91", 00:17:21.327 "strip_size_kb": 64, 00:17:21.327 "state": "online", 00:17:21.327 "raid_level": "concat", 00:17:21.327 "superblock": true, 00:17:21.327 "num_base_bdevs": 4, 00:17:21.327 "num_base_bdevs_discovered": 4, 00:17:21.327 "num_base_bdevs_operational": 4, 00:17:21.327 "base_bdevs_list": [ 00:17:21.327 { 00:17:21.327 "name": "NewBaseBdev", 00:17:21.327 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:21.327 "is_configured": true, 00:17:21.327 "data_offset": 2048, 00:17:21.327 "data_size": 63488 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "name": "BaseBdev2", 00:17:21.327 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:21.327 "is_configured": true, 00:17:21.327 "data_offset": 2048, 00:17:21.327 "data_size": 63488 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "name": "BaseBdev3", 00:17:21.327 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:21.327 "is_configured": true, 00:17:21.327 "data_offset": 2048, 00:17:21.327 "data_size": 63488 00:17:21.327 }, 00:17:21.327 { 00:17:21.327 "name": "BaseBdev4", 00:17:21.327 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 
00:17:21.327 "is_configured": true, 00:17:21.327 "data_offset": 2048, 00:17:21.327 "data_size": 63488 00:17:21.327 } 00:17:21.327 ] 00:17:21.327 } 00:17:21.327 } 00:17:21.327 }' 00:17:21.327 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:21.587 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:21.587 BaseBdev2 00:17:21.587 BaseBdev3 00:17:21.587 BaseBdev4' 00:17:21.587 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.587 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:21.587 13:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.587 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.587 "name": "NewBaseBdev", 00:17:21.587 "aliases": [ 00:17:21.587 "eefc3b8a-b375-48e9-a856-6b4dc939f87d" 00:17:21.587 ], 00:17:21.587 "product_name": "Malloc disk", 00:17:21.587 "block_size": 512, 00:17:21.587 "num_blocks": 65536, 00:17:21.587 "uuid": "eefc3b8a-b375-48e9-a856-6b4dc939f87d", 00:17:21.587 "assigned_rate_limits": { 00:17:21.587 "rw_ios_per_sec": 0, 00:17:21.587 "rw_mbytes_per_sec": 0, 00:17:21.587 "r_mbytes_per_sec": 0, 00:17:21.587 "w_mbytes_per_sec": 0 00:17:21.587 }, 00:17:21.587 "claimed": true, 00:17:21.587 "claim_type": "exclusive_write", 00:17:21.587 "zoned": false, 00:17:21.587 "supported_io_types": { 00:17:21.587 "read": true, 00:17:21.587 "write": true, 00:17:21.587 "unmap": true, 00:17:21.587 "write_zeroes": true, 00:17:21.587 "flush": true, 00:17:21.587 "reset": true, 00:17:21.587 "compare": false, 00:17:21.587 "compare_and_write": false, 00:17:21.587 "abort": true, 
00:17:21.587 "nvme_admin": false, 00:17:21.587 "nvme_io": false 00:17:21.587 }, 00:17:21.587 "memory_domains": [ 00:17:21.587 { 00:17:21.587 "dma_device_id": "system", 00:17:21.587 "dma_device_type": 1 00:17:21.587 }, 00:17:21.587 { 00:17:21.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.587 "dma_device_type": 2 00:17:21.587 } 00:17:21.587 ], 00:17:21.587 "driver_specific": {} 00:17:21.587 }' 00:17:21.587 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.847 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.107 "name": "BaseBdev2", 00:17:22.107 "aliases": [ 00:17:22.107 "e5fa943c-7cec-4a9a-a422-929a6baa2943" 00:17:22.107 ], 00:17:22.107 "product_name": "Malloc disk", 00:17:22.107 "block_size": 512, 00:17:22.107 "num_blocks": 65536, 00:17:22.107 "uuid": "e5fa943c-7cec-4a9a-a422-929a6baa2943", 00:17:22.107 "assigned_rate_limits": { 00:17:22.107 "rw_ios_per_sec": 0, 00:17:22.107 "rw_mbytes_per_sec": 0, 00:17:22.107 "r_mbytes_per_sec": 0, 00:17:22.107 "w_mbytes_per_sec": 0 00:17:22.107 }, 00:17:22.107 "claimed": true, 00:17:22.107 "claim_type": "exclusive_write", 00:17:22.107 "zoned": false, 00:17:22.107 "supported_io_types": { 00:17:22.107 "read": true, 00:17:22.107 "write": true, 00:17:22.107 "unmap": true, 00:17:22.107 "write_zeroes": true, 00:17:22.107 "flush": true, 00:17:22.107 "reset": true, 00:17:22.107 "compare": false, 00:17:22.107 "compare_and_write": false, 00:17:22.107 "abort": true, 00:17:22.107 "nvme_admin": false, 00:17:22.107 "nvme_io": false 00:17:22.107 }, 00:17:22.107 "memory_domains": [ 00:17:22.107 { 00:17:22.107 "dma_device_id": "system", 00:17:22.107 "dma_device_type": 1 00:17:22.107 }, 00:17:22.107 { 00:17:22.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.107 "dma_device_type": 2 00:17:22.107 } 00:17:22.107 ], 00:17:22.107 "driver_specific": {} 00:17:22.107 }' 00:17:22.107 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.367 13:45:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.367 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:22.627 13:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.887 "name": "BaseBdev3", 00:17:22.887 "aliases": [ 00:17:22.887 "5a16f325-29e3-4738-a5fd-9d5e117bef7e" 00:17:22.887 ], 00:17:22.887 "product_name": "Malloc disk", 00:17:22.887 "block_size": 512, 00:17:22.887 "num_blocks": 65536, 00:17:22.887 "uuid": "5a16f325-29e3-4738-a5fd-9d5e117bef7e", 00:17:22.887 "assigned_rate_limits": { 00:17:22.887 "rw_ios_per_sec": 0, 00:17:22.887 "rw_mbytes_per_sec": 0, 00:17:22.887 "r_mbytes_per_sec": 0, 00:17:22.887 "w_mbytes_per_sec": 0 00:17:22.887 }, 00:17:22.887 "claimed": true, 00:17:22.887 "claim_type": "exclusive_write", 00:17:22.887 "zoned": false, 00:17:22.887 "supported_io_types": 
{ 00:17:22.887 "read": true, 00:17:22.887 "write": true, 00:17:22.887 "unmap": true, 00:17:22.887 "write_zeroes": true, 00:17:22.887 "flush": true, 00:17:22.887 "reset": true, 00:17:22.887 "compare": false, 00:17:22.887 "compare_and_write": false, 00:17:22.887 "abort": true, 00:17:22.887 "nvme_admin": false, 00:17:22.887 "nvme_io": false 00:17:22.887 }, 00:17:22.887 "memory_domains": [ 00:17:22.887 { 00:17:22.887 "dma_device_id": "system", 00:17:22.887 "dma_device_type": 1 00:17:22.887 }, 00:17:22.887 { 00:17:22.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.887 "dma_device_type": 2 00:17:22.887 } 00:17:22.887 ], 00:17:22.887 "driver_specific": {} 00:17:22.887 }' 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.887 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:23.146 13:45:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:23.146 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:23.406 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:23.406 "name": "BaseBdev4", 00:17:23.406 "aliases": [ 00:17:23.406 "eba3155f-3e21-45fa-a193-8bbaeca22740" 00:17:23.406 ], 00:17:23.406 "product_name": "Malloc disk", 00:17:23.406 "block_size": 512, 00:17:23.406 "num_blocks": 65536, 00:17:23.406 "uuid": "eba3155f-3e21-45fa-a193-8bbaeca22740", 00:17:23.406 "assigned_rate_limits": { 00:17:23.406 "rw_ios_per_sec": 0, 00:17:23.406 "rw_mbytes_per_sec": 0, 00:17:23.406 "r_mbytes_per_sec": 0, 00:17:23.406 "w_mbytes_per_sec": 0 00:17:23.406 }, 00:17:23.406 "claimed": true, 00:17:23.406 "claim_type": "exclusive_write", 00:17:23.406 "zoned": false, 00:17:23.406 "supported_io_types": { 00:17:23.406 "read": true, 00:17:23.406 "write": true, 00:17:23.406 "unmap": true, 00:17:23.406 "write_zeroes": true, 00:17:23.406 "flush": true, 00:17:23.406 "reset": true, 00:17:23.406 "compare": false, 00:17:23.406 "compare_and_write": false, 00:17:23.406 "abort": true, 00:17:23.406 "nvme_admin": false, 00:17:23.406 "nvme_io": false 00:17:23.406 }, 00:17:23.406 "memory_domains": [ 00:17:23.406 { 00:17:23.406 "dma_device_id": "system", 00:17:23.406 "dma_device_type": 1 00:17:23.406 }, 00:17:23.406 { 00:17:23.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.406 "dma_device_type": 2 00:17:23.406 } 00:17:23.406 ], 00:17:23.406 "driver_specific": {} 00:17:23.406 }' 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:23.407 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.666 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.666 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:23.666 13:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:23.666 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:23.666 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:23.666 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:23.927 [2024-06-10 13:45:38.239275] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:23.927 [2024-06-10 13:45:38.239293] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:23.927 [2024-06-10 13:45:38.239335] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:23.927 [2024-06-10 13:45:38.239383] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:23.927 [2024-06-10 13:45:38.239390] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24be8f0 name Existed_Raid, state offline 00:17:23.927 
13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1587913 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1587913 ']' 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1587913 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1587913 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1587913' 00:17:23.927 killing process with pid 1587913 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1587913 00:17:23.927 [2024-06-10 13:45:38.308343] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:23.927 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1587913 00:17:23.927 [2024-06-10 13:45:38.329715] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:24.189 13:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:24.189 00:17:24.189 real 0m28.259s 00:17:24.189 user 0m53.027s 00:17:24.189 sys 0m4.153s 00:17:24.189 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:24.189 13:45:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.189 
************************************ 00:17:24.189 END TEST raid_state_function_test_sb 00:17:24.189 ************************************ 00:17:24.189 13:45:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:17:24.189 13:45:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:17:24.189 13:45:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:24.189 13:45:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:24.189 ************************************ 00:17:24.189 START TEST raid_superblock_test 00:17:24.189 ************************************ 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 4 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1594123 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1594123 /var/tmp/spdk-raid.sock 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1594123 ']' 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:24.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:24.189 13:45:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.189 [2024-06-10 13:45:38.590064] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
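The setup loop being entered here (bdev_raid.sh@415-425) creates one malloc bdev per base bdev and wraps each in a passthru bdev with a fixed UUID. A rough sketch of that pattern, with an `rpc` shim standing in for the `rpc.py -s /var/tmp/spdk-raid.sock` invocation — echoed rather than executed, since this sketch has no live SPDK target:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in for ".../spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock";
# prints the RPC instead of sending it (no SPDK app is running here).
rpc() { echo "rpc.py -s /var/tmp/spdk-raid.sock $*"; }

num_base_bdevs=4
for ((i = 1; i <= num_base_bdevs; i++)); do
  bdev_malloc=malloc$i
  bdev_pt=pt$i
  # Zero-padded UUIDs matching the log: 00000000-0000-0000-0000-000000000001 ...
  bdev_pt_uuid=$(printf '00000000-0000-0000-0000-%012d' "$i")

  # Same RPCs as bdev_raid.sh@424-425 in the log: a 32 MB malloc bdev with
  # 512-byte blocks (65536 blocks, matching num_blocks above), then a
  # passthru bdev that claims it.
  rpc bdev_malloc_create 32 512 -b "$bdev_malloc"
  rpc bdev_passthru_create -b "$bdev_malloc" -p "$bdev_pt" -u "$bdev_pt_uuid"
done
```

The resulting `pt1 pt2 pt3 pt4` set is what `bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s` later assembles, as seen further down in the log.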
00:17:24.189 [2024-06-10 13:45:38.590116] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594123 ] 00:17:24.450 [2024-06-10 13:45:38.681577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:24.450 [2024-06-10 13:45:38.750056] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:17:24.450 [2024-06-10 13:45:38.806594] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:24.450 [2024-06-10 13:45:38.806621] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:25.020 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:17:25.280 malloc1 00:17:25.280 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:25.540 [2024-06-10 13:45:39.826507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:25.540 [2024-06-10 13:45:39.826541] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.540 [2024-06-10 13:45:39.826553] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x109d550 00:17:25.540 [2024-06-10 13:45:39.826560] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.540 [2024-06-10 13:45:39.827874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.540 [2024-06-10 13:45:39.827895] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:25.540 pt1 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:25.540 13:45:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:25.800 malloc2 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:25.800 [2024-06-10 13:45:40.237805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:25.800 [2024-06-10 13:45:40.237838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.800 [2024-06-10 13:45:40.237849] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x115f0f0 00:17:25.800 [2024-06-10 13:45:40.237856] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.800 [2024-06-10 13:45:40.239138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.800 [2024-06-10 13:45:40.239157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:25.800 pt2 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:25.800 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:26.060 malloc3 00:17:26.060 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:26.320 [2024-06-10 13:45:40.664920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:26.320 [2024-06-10 13:45:40.664950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.320 [2024-06-10 13:45:40.664961] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11605b0 00:17:26.320 [2024-06-10 13:45:40.664969] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.320 [2024-06-10 13:45:40.666241] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.320 [2024-06-10 13:45:40.666260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:26.320 pt3 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:26.320 13:45:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:26.320 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:26.580 malloc4 00:17:26.580 13:45:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:26.840 [2024-06-10 13:45:41.076019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:26.840 [2024-06-10 13:45:41.076047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.840 [2024-06-10 13:45:41.076057] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1160d90 00:17:26.840 [2024-06-10 13:45:41.076064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.840 [2024-06-10 13:45:41.077307] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.840 [2024-06-10 13:45:41.077327] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:26.840 pt4 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:26.840 [2024-06-10 13:45:41.276550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:17:26.840 [2024-06-10 13:45:41.277609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:26.840 [2024-06-10 13:45:41.277653] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:26.840 [2024-06-10 13:45:41.277689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:26.840 [2024-06-10 13:45:41.277830] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1096b60 00:17:26.840 [2024-06-10 13:45:41.277837] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:26.840 [2024-06-10 13:45:41.277995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1169920 00:17:26.840 [2024-06-10 13:45:41.278111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1096b60 00:17:26.840 [2024-06-10 13:45:41.278117] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1096b60 00:17:26.840 [2024-06-10 13:45:41.278197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.840 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:27.100 13:45:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.100 "name": "raid_bdev1", 00:17:27.100 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:27.100 "strip_size_kb": 64, 00:17:27.100 "state": "online", 00:17:27.100 "raid_level": "concat", 00:17:27.100 "superblock": true, 00:17:27.100 "num_base_bdevs": 4, 00:17:27.100 "num_base_bdevs_discovered": 4, 00:17:27.100 "num_base_bdevs_operational": 4, 00:17:27.100 "base_bdevs_list": [ 00:17:27.100 { 00:17:27.100 "name": "pt1", 00:17:27.100 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:27.100 "is_configured": true, 00:17:27.100 "data_offset": 2048, 00:17:27.100 "data_size": 63488 00:17:27.100 }, 00:17:27.100 { 00:17:27.100 "name": "pt2", 00:17:27.100 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:27.100 "is_configured": true, 00:17:27.100 "data_offset": 2048, 00:17:27.100 "data_size": 63488 00:17:27.100 }, 00:17:27.100 { 00:17:27.100 "name": "pt3", 00:17:27.100 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:27.100 "is_configured": true, 00:17:27.100 "data_offset": 2048, 00:17:27.100 "data_size": 63488 00:17:27.100 }, 00:17:27.100 { 00:17:27.100 "name": "pt4", 00:17:27.100 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:27.100 "is_configured": true, 00:17:27.100 "data_offset": 2048, 00:17:27.100 "data_size": 63488 00:17:27.100 } 00:17:27.100 ] 00:17:27.100 }' 00:17:27.100 13:45:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.100 13:45:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:27.671 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:27.931 [2024-06-10 13:45:42.219168] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:27.931 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:27.931 "name": "raid_bdev1", 00:17:27.931 "aliases": [ 00:17:27.931 "863dbc7e-10d0-4e43-94ec-fe880ef91a54" 00:17:27.931 ], 00:17:27.931 "product_name": "Raid Volume", 00:17:27.931 "block_size": 512, 00:17:27.931 "num_blocks": 253952, 00:17:27.931 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:27.931 "assigned_rate_limits": { 00:17:27.931 "rw_ios_per_sec": 0, 00:17:27.931 "rw_mbytes_per_sec": 0, 00:17:27.931 "r_mbytes_per_sec": 0, 00:17:27.931 "w_mbytes_per_sec": 0 00:17:27.931 }, 00:17:27.931 "claimed": false, 00:17:27.931 "zoned": false, 00:17:27.931 "supported_io_types": { 00:17:27.931 "read": true, 00:17:27.931 "write": true, 00:17:27.931 
"unmap": true, 00:17:27.931 "write_zeroes": true, 00:17:27.932 "flush": true, 00:17:27.932 "reset": true, 00:17:27.932 "compare": false, 00:17:27.932 "compare_and_write": false, 00:17:27.932 "abort": false, 00:17:27.932 "nvme_admin": false, 00:17:27.932 "nvme_io": false 00:17:27.932 }, 00:17:27.932 "memory_domains": [ 00:17:27.932 { 00:17:27.932 "dma_device_id": "system", 00:17:27.932 "dma_device_type": 1 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.932 "dma_device_type": 2 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "system", 00:17:27.932 "dma_device_type": 1 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.932 "dma_device_type": 2 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "system", 00:17:27.932 "dma_device_type": 1 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.932 "dma_device_type": 2 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "system", 00:17:27.932 "dma_device_type": 1 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.932 "dma_device_type": 2 00:17:27.932 } 00:17:27.932 ], 00:17:27.932 "driver_specific": { 00:17:27.932 "raid": { 00:17:27.932 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:27.932 "strip_size_kb": 64, 00:17:27.932 "state": "online", 00:17:27.932 "raid_level": "concat", 00:17:27.932 "superblock": true, 00:17:27.932 "num_base_bdevs": 4, 00:17:27.932 "num_base_bdevs_discovered": 4, 00:17:27.932 "num_base_bdevs_operational": 4, 00:17:27.932 "base_bdevs_list": [ 00:17:27.932 { 00:17:27.932 "name": "pt1", 00:17:27.932 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:27.932 "is_configured": true, 00:17:27.932 "data_offset": 2048, 00:17:27.932 "data_size": 63488 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "name": "pt2", 00:17:27.932 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:27.932 
"is_configured": true, 00:17:27.932 "data_offset": 2048, 00:17:27.932 "data_size": 63488 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "name": "pt3", 00:17:27.932 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:27.932 "is_configured": true, 00:17:27.932 "data_offset": 2048, 00:17:27.932 "data_size": 63488 00:17:27.932 }, 00:17:27.932 { 00:17:27.932 "name": "pt4", 00:17:27.932 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:27.932 "is_configured": true, 00:17:27.932 "data_offset": 2048, 00:17:27.932 "data_size": 63488 00:17:27.932 } 00:17:27.932 ] 00:17:27.932 } 00:17:27.932 } 00:17:27.932 }' 00:17:27.932 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:27.932 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:27.932 pt2 00:17:27.932 pt3 00:17:27.932 pt4' 00:17:27.932 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:27.932 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:27.932 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:28.192 "name": "pt1", 00:17:28.192 "aliases": [ 00:17:28.192 "00000000-0000-0000-0000-000000000001" 00:17:28.192 ], 00:17:28.192 "product_name": "passthru", 00:17:28.192 "block_size": 512, 00:17:28.192 "num_blocks": 65536, 00:17:28.192 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:28.192 "assigned_rate_limits": { 00:17:28.192 "rw_ios_per_sec": 0, 00:17:28.192 "rw_mbytes_per_sec": 0, 00:17:28.192 "r_mbytes_per_sec": 0, 00:17:28.192 "w_mbytes_per_sec": 0 00:17:28.192 }, 00:17:28.192 "claimed": true, 00:17:28.192 "claim_type": "exclusive_write", 
00:17:28.192 "zoned": false, 00:17:28.192 "supported_io_types": { 00:17:28.192 "read": true, 00:17:28.192 "write": true, 00:17:28.192 "unmap": true, 00:17:28.192 "write_zeroes": true, 00:17:28.192 "flush": true, 00:17:28.192 "reset": true, 00:17:28.192 "compare": false, 00:17:28.192 "compare_and_write": false, 00:17:28.192 "abort": true, 00:17:28.192 "nvme_admin": false, 00:17:28.192 "nvme_io": false 00:17:28.192 }, 00:17:28.192 "memory_domains": [ 00:17:28.192 { 00:17:28.192 "dma_device_id": "system", 00:17:28.192 "dma_device_type": 1 00:17:28.192 }, 00:17:28.192 { 00:17:28.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.192 "dma_device_type": 2 00:17:28.192 } 00:17:28.192 ], 00:17:28.192 "driver_specific": { 00:17:28.192 "passthru": { 00:17:28.192 "name": "pt1", 00:17:28.192 "base_bdev_name": "malloc1" 00:17:28.192 } 00:17:28.192 } 00:17:28.192 }' 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:28.192 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.452 13:45:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:28.452 13:45:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:28.712 "name": "pt2", 00:17:28.712 "aliases": [ 00:17:28.712 "00000000-0000-0000-0000-000000000002" 00:17:28.712 ], 00:17:28.712 "product_name": "passthru", 00:17:28.712 "block_size": 512, 00:17:28.712 "num_blocks": 65536, 00:17:28.712 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:28.712 "assigned_rate_limits": { 00:17:28.712 "rw_ios_per_sec": 0, 00:17:28.712 "rw_mbytes_per_sec": 0, 00:17:28.712 "r_mbytes_per_sec": 0, 00:17:28.712 "w_mbytes_per_sec": 0 00:17:28.712 }, 00:17:28.712 "claimed": true, 00:17:28.712 "claim_type": "exclusive_write", 00:17:28.712 "zoned": false, 00:17:28.712 "supported_io_types": { 00:17:28.712 "read": true, 00:17:28.712 "write": true, 00:17:28.712 "unmap": true, 00:17:28.712 "write_zeroes": true, 00:17:28.712 "flush": true, 00:17:28.712 "reset": true, 00:17:28.712 "compare": false, 00:17:28.712 "compare_and_write": false, 00:17:28.712 "abort": true, 00:17:28.712 "nvme_admin": false, 00:17:28.712 "nvme_io": false 00:17:28.712 }, 00:17:28.712 "memory_domains": [ 00:17:28.712 { 00:17:28.712 "dma_device_id": "system", 00:17:28.712 "dma_device_type": 1 00:17:28.712 }, 00:17:28.712 { 00:17:28.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.712 "dma_device_type": 2 00:17:28.712 } 00:17:28.712 ], 00:17:28.712 "driver_specific": { 00:17:28.712 "passthru": { 00:17:28.712 "name": "pt2", 00:17:28.712 "base_bdev_name": "malloc2" 00:17:28.712 } 00:17:28.712 } 
00:17:28.712 }' 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.712 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:28.971 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.231 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.231 "name": "pt3", 00:17:29.231 "aliases": [ 00:17:29.231 "00000000-0000-0000-0000-000000000003" 00:17:29.231 ], 00:17:29.231 "product_name": "passthru", 00:17:29.231 "block_size": 512, 00:17:29.231 "num_blocks": 65536, 00:17:29.231 "uuid": "00000000-0000-0000-0000-000000000003", 
00:17:29.231 "assigned_rate_limits": { 00:17:29.231 "rw_ios_per_sec": 0, 00:17:29.231 "rw_mbytes_per_sec": 0, 00:17:29.231 "r_mbytes_per_sec": 0, 00:17:29.231 "w_mbytes_per_sec": 0 00:17:29.231 }, 00:17:29.231 "claimed": true, 00:17:29.231 "claim_type": "exclusive_write", 00:17:29.231 "zoned": false, 00:17:29.231 "supported_io_types": { 00:17:29.231 "read": true, 00:17:29.231 "write": true, 00:17:29.231 "unmap": true, 00:17:29.231 "write_zeroes": true, 00:17:29.231 "flush": true, 00:17:29.231 "reset": true, 00:17:29.231 "compare": false, 00:17:29.231 "compare_and_write": false, 00:17:29.231 "abort": true, 00:17:29.231 "nvme_admin": false, 00:17:29.231 "nvme_io": false 00:17:29.231 }, 00:17:29.231 "memory_domains": [ 00:17:29.231 { 00:17:29.231 "dma_device_id": "system", 00:17:29.231 "dma_device_type": 1 00:17:29.231 }, 00:17:29.231 { 00:17:29.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.231 "dma_device_type": 2 00:17:29.231 } 00:17:29.231 ], 00:17:29.231 "driver_specific": { 00:17:29.231 "passthru": { 00:17:29.231 "name": "pt3", 00:17:29.231 "base_bdev_name": "malloc3" 00:17:29.231 } 00:17:29.231 } 00:17:29.231 }' 00:17:29.231 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.231 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.231 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.231 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.491 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.751 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:29.751 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.751 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:29.751 13:45:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.751 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.751 "name": "pt4", 00:17:29.751 "aliases": [ 00:17:29.751 "00000000-0000-0000-0000-000000000004" 00:17:29.751 ], 00:17:29.751 "product_name": "passthru", 00:17:29.751 "block_size": 512, 00:17:29.751 "num_blocks": 65536, 00:17:29.751 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:29.751 "assigned_rate_limits": { 00:17:29.751 "rw_ios_per_sec": 0, 00:17:29.751 "rw_mbytes_per_sec": 0, 00:17:29.751 "r_mbytes_per_sec": 0, 00:17:29.751 "w_mbytes_per_sec": 0 00:17:29.751 }, 00:17:29.751 "claimed": true, 00:17:29.751 "claim_type": "exclusive_write", 00:17:29.751 "zoned": false, 00:17:29.751 "supported_io_types": { 00:17:29.751 "read": true, 00:17:29.751 "write": true, 00:17:29.751 "unmap": true, 00:17:29.751 "write_zeroes": true, 00:17:29.751 "flush": true, 00:17:29.751 "reset": true, 00:17:29.751 "compare": false, 00:17:29.751 "compare_and_write": false, 00:17:29.751 "abort": true, 00:17:29.751 "nvme_admin": false, 00:17:29.751 "nvme_io": false 00:17:29.751 }, 00:17:29.751 "memory_domains": [ 00:17:29.751 { 00:17:29.751 "dma_device_id": "system", 00:17:29.751 "dma_device_type": 1 00:17:29.751 }, 00:17:29.751 { 00:17:29.751 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.751 "dma_device_type": 2 00:17:29.751 } 00:17:29.751 ], 00:17:29.751 "driver_specific": { 00:17:29.751 "passthru": { 00:17:29.751 "name": "pt4", 00:17:29.751 "base_bdev_name": "malloc4" 00:17:29.751 } 00:17:29.751 } 00:17:29.751 }' 00:17:29.751 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.751 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.010 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:30.270 [2024-06-10 13:45:44.713498] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=863dbc7e-10d0-4e43-94ec-fe880ef91a54 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 863dbc7e-10d0-4e43-94ec-fe880ef91a54 ']' 00:17:30.270 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:30.529 [2024-06-10 13:45:44.917771] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:30.529 [2024-06-10 13:45:44.917782] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:30.529 [2024-06-10 13:45:44.917820] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:30.529 [2024-06-10 13:45:44.917877] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:30.529 [2024-06-10 13:45:44.917884] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1096b60 name raid_bdev1, state offline 00:17:30.529 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.529 13:45:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:30.788 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:30.788 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:30.789 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:30.789 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:31.048 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.048 13:45:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:31.048 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.048 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:31.308 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.308 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:31.567 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:31.567 13:45:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:31.827 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:32.086 [2024-06-10 13:45:46.325284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:32.086 [2024-06-10 13:45:46.326423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:32.086 [2024-06-10 13:45:46.326458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:32.087 [2024-06-10 13:45:46.326487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:32.087 [2024-06-10 13:45:46.326523] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:17:32.087 [2024-06-10 13:45:46.326549] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:32.087 [2024-06-10 13:45:46.326564] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:32.087 [2024-06-10 13:45:46.326578] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:32.087 [2024-06-10 13:45:46.326588] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:32.087 [2024-06-10 13:45:46.326594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x109df10 name raid_bdev1, state configuring 00:17:32.087 request: 00:17:32.087 { 00:17:32.087 "name": "raid_bdev1", 00:17:32.087 "raid_level": "concat", 00:17:32.087 "base_bdevs": [ 00:17:32.087 "malloc1", 00:17:32.087 "malloc2", 00:17:32.087 "malloc3", 00:17:32.087 "malloc4" 00:17:32.087 ], 00:17:32.087 "superblock": false, 00:17:32.087 "strip_size_kb": 64, 00:17:32.087 "method": "bdev_raid_create", 00:17:32.087 "req_id": 1 00:17:32.087 } 00:17:32.087 Got JSON-RPC error response 00:17:32.087 response: 00:17:32.087 { 00:17:32.087 "code": -17, 00:17:32.087 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:32.087 } 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.087 13:45:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:32.087 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:32.346 [2024-06-10 13:45:46.734270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:32.346 [2024-06-10 13:45:46.734297] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:32.346 [2024-06-10 13:45:46.734307] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10a07c0 00:17:32.346 [2024-06-10 13:45:46.734314] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:32.346 [2024-06-10 13:45:46.735640] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:32.346 [2024-06-10 13:45:46.735660] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:32.346 [2024-06-10 13:45:46.735706] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:32.346 [2024-06-10 13:45:46.735724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:32.346 pt1 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:32.346 13:45:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.346 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:32.605 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.605 "name": "raid_bdev1", 00:17:32.605 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:32.605 "strip_size_kb": 64, 00:17:32.605 "state": "configuring", 00:17:32.605 "raid_level": "concat", 00:17:32.605 "superblock": true, 00:17:32.605 "num_base_bdevs": 4, 00:17:32.605 "num_base_bdevs_discovered": 1, 00:17:32.605 "num_base_bdevs_operational": 4, 00:17:32.605 "base_bdevs_list": [ 00:17:32.605 { 00:17:32.605 "name": "pt1", 00:17:32.606 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:32.606 "is_configured": true, 00:17:32.606 "data_offset": 2048, 00:17:32.606 "data_size": 63488 00:17:32.606 }, 00:17:32.606 { 00:17:32.606 "name": null, 00:17:32.606 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:32.606 "is_configured": false, 00:17:32.606 "data_offset": 2048, 00:17:32.606 "data_size": 63488 00:17:32.606 }, 00:17:32.606 { 00:17:32.606 "name": null, 00:17:32.606 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:32.606 
"is_configured": false, 00:17:32.606 "data_offset": 2048, 00:17:32.606 "data_size": 63488 00:17:32.606 }, 00:17:32.606 { 00:17:32.606 "name": null, 00:17:32.606 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:32.606 "is_configured": false, 00:17:32.606 "data_offset": 2048, 00:17:32.606 "data_size": 63488 00:17:32.606 } 00:17:32.606 ] 00:17:32.606 }' 00:17:32.606 13:45:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.606 13:45:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.175 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:33.175 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:33.435 [2024-06-10 13:45:47.704736] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:33.435 [2024-06-10 13:45:47.704768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.435 [2024-06-10 13:45:47.704782] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x109d9f0 00:17:33.435 [2024-06-10 13:45:47.704788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.435 [2024-06-10 13:45:47.705064] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.435 [2024-06-10 13:45:47.705076] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:33.435 [2024-06-10 13:45:47.705118] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:33.435 [2024-06-10 13:45:47.705131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:33.435 pt2 00:17:33.435 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:33.435 [2024-06-10 13:45:47.909265] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.695 13:45:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:33.695 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.695 "name": "raid_bdev1", 00:17:33.695 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:33.695 "strip_size_kb": 64, 00:17:33.695 "state": "configuring", 00:17:33.695 "raid_level": "concat", 00:17:33.695 "superblock": true, 
00:17:33.695 "num_base_bdevs": 4, 00:17:33.695 "num_base_bdevs_discovered": 1, 00:17:33.695 "num_base_bdevs_operational": 4, 00:17:33.695 "base_bdevs_list": [ 00:17:33.695 { 00:17:33.695 "name": "pt1", 00:17:33.695 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:33.695 "is_configured": true, 00:17:33.695 "data_offset": 2048, 00:17:33.695 "data_size": 63488 00:17:33.695 }, 00:17:33.695 { 00:17:33.695 "name": null, 00:17:33.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:33.695 "is_configured": false, 00:17:33.695 "data_offset": 2048, 00:17:33.695 "data_size": 63488 00:17:33.695 }, 00:17:33.695 { 00:17:33.695 "name": null, 00:17:33.695 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:33.695 "is_configured": false, 00:17:33.695 "data_offset": 2048, 00:17:33.695 "data_size": 63488 00:17:33.695 }, 00:17:33.695 { 00:17:33.695 "name": null, 00:17:33.695 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:33.695 "is_configured": false, 00:17:33.695 "data_offset": 2048, 00:17:33.695 "data_size": 63488 00:17:33.695 } 00:17:33.695 ] 00:17:33.695 }' 00:17:33.695 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.695 13:45:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.264 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:34.264 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.264 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:34.524 [2024-06-10 13:45:48.879729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:34.524 [2024-06-10 13:45:48.879760] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.524 [2024-06-10 
13:45:48.879773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1097e10 00:17:34.524 [2024-06-10 13:45:48.879780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.524 [2024-06-10 13:45:48.880051] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.524 [2024-06-10 13:45:48.880062] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:34.524 [2024-06-10 13:45:48.880103] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:34.524 [2024-06-10 13:45:48.880115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:34.524 pt2 00:17:34.524 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:34.524 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.524 13:45:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:34.784 [2024-06-10 13:45:49.072222] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:34.784 [2024-06-10 13:45:49.072242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.784 [2024-06-10 13:45:49.072250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x115fc10 00:17:34.785 [2024-06-10 13:45:49.072257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.785 [2024-06-10 13:45:49.072475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.785 [2024-06-10 13:45:49.072486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:34.785 [2024-06-10 13:45:49.072519] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt3 00:17:34.785 [2024-06-10 13:45:49.072529] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:34.785 pt3 00:17:34.785 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:34.785 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.785 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:35.045 [2024-06-10 13:45:49.276741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:35.045 [2024-06-10 13:45:49.276759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:35.045 [2024-06-10 13:45:49.276768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1098040 00:17:35.045 [2024-06-10 13:45:49.276774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:35.045 [2024-06-10 13:45:49.276989] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:35.045 [2024-06-10 13:45:49.277000] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:35.045 [2024-06-10 13:45:49.277031] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:35.045 [2024-06-10 13:45:49.277042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:35.045 [2024-06-10 13:45:49.277135] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x109f2b0 00:17:35.045 [2024-06-10 13:45:49.277141] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:35.045 [2024-06-10 13:45:49.277299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x109dc80 00:17:35.045 [2024-06-10 13:45:49.277405] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x109f2b0 00:17:35.045 [2024-06-10 13:45:49.277410] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x109f2b0 00:17:35.045 [2024-06-10 13:45:49.277487] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:35.045 pt4 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.045 "name": "raid_bdev1", 00:17:35.045 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:35.045 "strip_size_kb": 64, 00:17:35.045 "state": "online", 00:17:35.045 "raid_level": "concat", 00:17:35.045 "superblock": true, 00:17:35.045 "num_base_bdevs": 4, 00:17:35.045 "num_base_bdevs_discovered": 4, 00:17:35.045 "num_base_bdevs_operational": 4, 00:17:35.045 "base_bdevs_list": [ 00:17:35.045 { 00:17:35.045 "name": "pt1", 00:17:35.045 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.045 "is_configured": true, 00:17:35.045 "data_offset": 2048, 00:17:35.045 "data_size": 63488 00:17:35.045 }, 00:17:35.045 { 00:17:35.045 "name": "pt2", 00:17:35.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.045 "is_configured": true, 00:17:35.045 "data_offset": 2048, 00:17:35.045 "data_size": 63488 00:17:35.045 }, 00:17:35.045 { 00:17:35.045 "name": "pt3", 00:17:35.045 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.045 "is_configured": true, 00:17:35.045 "data_offset": 2048, 00:17:35.045 "data_size": 63488 00:17:35.045 }, 00:17:35.045 { 00:17:35.045 "name": "pt4", 00:17:35.045 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:35.045 "is_configured": true, 00:17:35.045 "data_offset": 2048, 00:17:35.045 "data_size": 63488 00:17:35.045 } 00:17:35.045 ] 00:17:35.045 }' 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.045 13:45:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:35.615 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:35.875 [2024-06-10 13:45:50.235464] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:35.875 "name": "raid_bdev1", 00:17:35.875 "aliases": [ 00:17:35.875 "863dbc7e-10d0-4e43-94ec-fe880ef91a54" 00:17:35.875 ], 00:17:35.875 "product_name": "Raid Volume", 00:17:35.875 "block_size": 512, 00:17:35.875 "num_blocks": 253952, 00:17:35.875 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:35.875 "assigned_rate_limits": { 00:17:35.875 "rw_ios_per_sec": 0, 00:17:35.875 "rw_mbytes_per_sec": 0, 00:17:35.875 "r_mbytes_per_sec": 0, 00:17:35.875 "w_mbytes_per_sec": 0 00:17:35.875 }, 00:17:35.875 "claimed": false, 00:17:35.875 "zoned": false, 00:17:35.875 "supported_io_types": { 00:17:35.875 "read": true, 00:17:35.875 "write": true, 00:17:35.875 "unmap": true, 00:17:35.875 "write_zeroes": true, 00:17:35.875 "flush": true, 00:17:35.875 "reset": true, 00:17:35.875 "compare": false, 00:17:35.875 "compare_and_write": false, 00:17:35.875 "abort": false, 00:17:35.875 "nvme_admin": false, 00:17:35.875 "nvme_io": false 00:17:35.875 }, 00:17:35.875 "memory_domains": [ 00:17:35.875 { 00:17:35.875 "dma_device_id": "system", 00:17:35.875 "dma_device_type": 1 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.875 "dma_device_type": 2 00:17:35.875 }, 00:17:35.875 { 
00:17:35.875 "dma_device_id": "system", 00:17:35.875 "dma_device_type": 1 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.875 "dma_device_type": 2 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "system", 00:17:35.875 "dma_device_type": 1 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.875 "dma_device_type": 2 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "system", 00:17:35.875 "dma_device_type": 1 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.875 "dma_device_type": 2 00:17:35.875 } 00:17:35.875 ], 00:17:35.875 "driver_specific": { 00:17:35.875 "raid": { 00:17:35.875 "uuid": "863dbc7e-10d0-4e43-94ec-fe880ef91a54", 00:17:35.875 "strip_size_kb": 64, 00:17:35.875 "state": "online", 00:17:35.875 "raid_level": "concat", 00:17:35.875 "superblock": true, 00:17:35.875 "num_base_bdevs": 4, 00:17:35.875 "num_base_bdevs_discovered": 4, 00:17:35.875 "num_base_bdevs_operational": 4, 00:17:35.875 "base_bdevs_list": [ 00:17:35.875 { 00:17:35.875 "name": "pt1", 00:17:35.875 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.875 "is_configured": true, 00:17:35.875 "data_offset": 2048, 00:17:35.875 "data_size": 63488 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "name": "pt2", 00:17:35.875 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.875 "is_configured": true, 00:17:35.875 "data_offset": 2048, 00:17:35.875 "data_size": 63488 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "name": "pt3", 00:17:35.875 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.875 "is_configured": true, 00:17:35.875 "data_offset": 2048, 00:17:35.875 "data_size": 63488 00:17:35.875 }, 00:17:35.875 { 00:17:35.875 "name": "pt4", 00:17:35.875 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:35.875 "is_configured": true, 00:17:35.875 "data_offset": 2048, 00:17:35.875 "data_size": 63488 00:17:35.875 } 00:17:35.875 ] 
00:17:35.875 } 00:17:35.875 } 00:17:35.875 }' 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:35.875 pt2 00:17:35.875 pt3 00:17:35.875 pt4' 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:35.875 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.135 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.135 "name": "pt1", 00:17:36.135 "aliases": [ 00:17:36.135 "00000000-0000-0000-0000-000000000001" 00:17:36.135 ], 00:17:36.135 "product_name": "passthru", 00:17:36.135 "block_size": 512, 00:17:36.135 "num_blocks": 65536, 00:17:36.135 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:36.135 "assigned_rate_limits": { 00:17:36.135 "rw_ios_per_sec": 0, 00:17:36.135 "rw_mbytes_per_sec": 0, 00:17:36.135 "r_mbytes_per_sec": 0, 00:17:36.135 "w_mbytes_per_sec": 0 00:17:36.135 }, 00:17:36.135 "claimed": true, 00:17:36.135 "claim_type": "exclusive_write", 00:17:36.135 "zoned": false, 00:17:36.135 "supported_io_types": { 00:17:36.135 "read": true, 00:17:36.135 "write": true, 00:17:36.135 "unmap": true, 00:17:36.135 "write_zeroes": true, 00:17:36.135 "flush": true, 00:17:36.135 "reset": true, 00:17:36.135 "compare": false, 00:17:36.135 "compare_and_write": false, 00:17:36.135 "abort": true, 00:17:36.135 "nvme_admin": false, 00:17:36.135 "nvme_io": false 00:17:36.135 }, 00:17:36.135 "memory_domains": [ 00:17:36.135 { 00:17:36.135 "dma_device_id": "system", 00:17:36.135 "dma_device_type": 1 00:17:36.135 }, 
00:17:36.135 { 00:17:36.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.135 "dma_device_type": 2 00:17:36.135 } 00:17:36.135 ], 00:17:36.135 "driver_specific": { 00:17:36.135 "passthru": { 00:17:36.135 "name": "pt1", 00:17:36.135 "base_bdev_name": "malloc1" 00:17:36.135 } 00:17:36.135 } 00:17:36.135 }' 00:17:36.135 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.135 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.135 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.135 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.395 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.655 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.655 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.655 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:36.655 13:45:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.655 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:17:36.655 "name": "pt2", 00:17:36.655 "aliases": [ 00:17:36.655 "00000000-0000-0000-0000-000000000002" 00:17:36.655 ], 00:17:36.655 "product_name": "passthru", 00:17:36.655 "block_size": 512, 00:17:36.655 "num_blocks": 65536, 00:17:36.655 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:36.655 "assigned_rate_limits": { 00:17:36.655 "rw_ios_per_sec": 0, 00:17:36.655 "rw_mbytes_per_sec": 0, 00:17:36.655 "r_mbytes_per_sec": 0, 00:17:36.655 "w_mbytes_per_sec": 0 00:17:36.655 }, 00:17:36.655 "claimed": true, 00:17:36.655 "claim_type": "exclusive_write", 00:17:36.655 "zoned": false, 00:17:36.655 "supported_io_types": { 00:17:36.655 "read": true, 00:17:36.655 "write": true, 00:17:36.655 "unmap": true, 00:17:36.655 "write_zeroes": true, 00:17:36.655 "flush": true, 00:17:36.655 "reset": true, 00:17:36.655 "compare": false, 00:17:36.655 "compare_and_write": false, 00:17:36.655 "abort": true, 00:17:36.655 "nvme_admin": false, 00:17:36.655 "nvme_io": false 00:17:36.655 }, 00:17:36.655 "memory_domains": [ 00:17:36.655 { 00:17:36.655 "dma_device_id": "system", 00:17:36.655 "dma_device_type": 1 00:17:36.655 }, 00:17:36.655 { 00:17:36.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.655 "dma_device_type": 2 00:17:36.655 } 00:17:36.655 ], 00:17:36.655 "driver_specific": { 00:17:36.655 "passthru": { 00:17:36.655 "name": "pt2", 00:17:36.655 "base_bdev_name": "malloc2" 00:17:36.655 } 00:17:36.655 } 00:17:36.655 }' 00:17:36.655 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.655 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.914 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:37.174 "name": "pt3", 00:17:37.174 "aliases": [ 00:17:37.174 "00000000-0000-0000-0000-000000000003" 00:17:37.174 ], 00:17:37.174 "product_name": "passthru", 00:17:37.174 "block_size": 512, 00:17:37.174 "num_blocks": 65536, 00:17:37.174 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:37.174 "assigned_rate_limits": { 00:17:37.174 "rw_ios_per_sec": 0, 00:17:37.174 "rw_mbytes_per_sec": 0, 00:17:37.174 "r_mbytes_per_sec": 0, 00:17:37.174 "w_mbytes_per_sec": 0 00:17:37.174 }, 00:17:37.174 "claimed": true, 00:17:37.174 "claim_type": "exclusive_write", 00:17:37.174 "zoned": false, 00:17:37.174 "supported_io_types": { 00:17:37.174 "read": true, 00:17:37.174 "write": true, 00:17:37.174 "unmap": true, 00:17:37.174 "write_zeroes": true, 00:17:37.174 "flush": true, 00:17:37.174 "reset": true, 00:17:37.174 "compare": false, 00:17:37.174 "compare_and_write": false, 
00:17:37.174 "abort": true, 00:17:37.174 "nvme_admin": false, 00:17:37.174 "nvme_io": false 00:17:37.174 }, 00:17:37.174 "memory_domains": [ 00:17:37.174 { 00:17:37.174 "dma_device_id": "system", 00:17:37.174 "dma_device_type": 1 00:17:37.174 }, 00:17:37.174 { 00:17:37.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.174 "dma_device_type": 2 00:17:37.174 } 00:17:37.174 ], 00:17:37.174 "driver_specific": { 00:17:37.174 "passthru": { 00:17:37.174 "name": "pt3", 00:17:37.174 "base_bdev_name": "malloc3" 00:17:37.174 } 00:17:37.174 } 00:17:37.174 }' 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.174 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.434 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.694 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:37.694 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:37.694 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:37.694 13:45:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:37.694 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:37.694 "name": "pt4", 00:17:37.694 "aliases": [ 00:17:37.694 "00000000-0000-0000-0000-000000000004" 00:17:37.694 ], 00:17:37.694 "product_name": "passthru", 00:17:37.694 "block_size": 512, 00:17:37.694 "num_blocks": 65536, 00:17:37.694 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:37.694 "assigned_rate_limits": { 00:17:37.694 "rw_ios_per_sec": 0, 00:17:37.694 "rw_mbytes_per_sec": 0, 00:17:37.694 "r_mbytes_per_sec": 0, 00:17:37.694 "w_mbytes_per_sec": 0 00:17:37.694 }, 00:17:37.694 "claimed": true, 00:17:37.694 "claim_type": "exclusive_write", 00:17:37.694 "zoned": false, 00:17:37.694 "supported_io_types": { 00:17:37.694 "read": true, 00:17:37.694 "write": true, 00:17:37.694 "unmap": true, 00:17:37.694 "write_zeroes": true, 00:17:37.694 "flush": true, 00:17:37.694 "reset": true, 00:17:37.694 "compare": false, 00:17:37.694 "compare_and_write": false, 00:17:37.694 "abort": true, 00:17:37.694 "nvme_admin": false, 00:17:37.694 "nvme_io": false 00:17:37.694 }, 00:17:37.694 "memory_domains": [ 00:17:37.694 { 00:17:37.694 "dma_device_id": "system", 00:17:37.694 "dma_device_type": 1 00:17:37.694 }, 00:17:37.694 { 00:17:37.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.694 "dma_device_type": 2 00:17:37.694 } 00:17:37.694 ], 00:17:37.694 "driver_specific": { 00:17:37.694 "passthru": { 00:17:37.694 "name": "pt4", 00:17:37.694 "base_bdev_name": "malloc4" 00:17:37.694 } 00:17:37.694 } 00:17:37.694 }' 00:17:37.694 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.954 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:38.214 [2024-06-10 13:45:52.649565] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 863dbc7e-10d0-4e43-94ec-fe880ef91a54 '!=' 863dbc7e-10d0-4e43-94ec-fe880ef91a54 ']' 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1594123 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 1594123 ']' 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1594123 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:38.214 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1594123 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1594123' 00:17:38.474 killing process with pid 1594123 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1594123 00:17:38.474 [2024-06-10 13:45:52.705935] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:38.474 [2024-06-10 13:45:52.705984] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:38.474 [2024-06-10 13:45:52.706035] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:38.474 [2024-06-10 13:45:52.706042] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x109f2b0 name raid_bdev1, state offline 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1594123 00:17:38.474 [2024-06-10 13:45:52.727521] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:38.474 00:17:38.474 real 0m14.318s 00:17:38.474 user 0m26.436s 00:17:38.474 sys 0m2.055s 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:17:38.474 13:45:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.474 ************************************ 00:17:38.474 END TEST raid_superblock_test 00:17:38.474 ************************************ 00:17:38.474 13:45:52 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:17:38.474 13:45:52 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:38.474 13:45:52 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:38.474 13:45:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:38.474 ************************************ 00:17:38.474 START TEST raid_read_error_test 00:17:38.474 ************************************ 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 read 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.474 13:45:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.iXnkvGMIDS 00:17:38.474 
13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1597407
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1597407 /var/tmp/spdk-raid.sock
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1597407 ']'
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:17:38.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable
00:17:38.474 13:45:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:38.734 [2024-06-10 13:45:52.998601] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:17:38.734 [2024-06-10 13:45:52.998650] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597407 ]
00:17:38.734 [2024-06-10 13:45:53.086289] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:17:38.734 [2024-06-10 13:45:53.151702] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:17:38.734 [2024-06-10 13:45:53.192189] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:17:38.734 [2024-06-10 13:45:53.192214] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:17:39.672 13:45:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:17:39.672 13:45:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0
00:17:39.672 13:45:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:39.672 13:45:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:17:39.672 BaseBdev1_malloc
00:17:39.672 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:17:39.932 true
00:17:39.933 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:17:40.192 [2024-06-10 13:45:54.443735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:17:40.192 [2024-06-10 13:45:54.443768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:40.192 [2024-06-10 13:45:54.443780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d2c90
00:17:40.192 [2024-06-10 13:45:54.443787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:40.192 [2024-06-10 13:45:54.445248] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:40.192 [2024-06-10 13:45:54.445270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:17:40.192 BaseBdev1
00:17:40.192 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:40.192 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:17:40.192 BaseBdev2_malloc
00:17:40.192 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:17:40.452 true
00:17:40.452 13:45:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:17:40.712 [2024-06-10 13:45:55.023265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:17:40.712 [2024-06-10 13:45:55.023295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:40.712 [2024-06-10 13:45:55.023308] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d7400
00:17:40.712 [2024-06-10 13:45:55.023314] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:40.712 [2024-06-10 13:45:55.024567] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:40.712 [2024-06-10 13:45:55.024586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:17:40.712 BaseBdev2
00:17:40.712 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:40.712 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:17:40.971 BaseBdev3_malloc
00:17:40.972 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:17:40.972 true
00:17:40.972 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:17:41.231 [2024-06-10 13:45:55.626816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:17:41.231 [2024-06-10 13:45:55.626843] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:41.231 [2024-06-10 13:45:55.626856] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d9fc0
00:17:41.231 [2024-06-10 13:45:55.626863] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:41.231 [2024-06-10 13:45:55.628111] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:41.231 [2024-06-10 13:45:55.628130] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:17:41.231 BaseBdev3
00:17:41.231 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:41.231 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:17:41.490 BaseBdev4_malloc
00:17:41.490 13:45:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:17:41.750 true
00:17:41.750 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:17:41.750 [2024-06-10 13:45:56.218346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:17:41.750 [2024-06-10 13:45:56.218377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:41.750 [2024-06-10 13:45:56.218393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10da710
00:17:41.750 [2024-06-10 13:45:56.218400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:41.750 [2024-06-10 13:45:56.219665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:41.750 [2024-06-10 13:45:56.219686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:17:41.750 BaseBdev4
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:17:42.010 [2024-06-10 13:45:56.418875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:17:42.010 [2024-06-10 13:45:56.419967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:17:42.010 [2024-06-10 13:45:56.420023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:17:42.010 [2024-06-10 13:45:56.420074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:17:42.010 [2024-06-10 13:45:56.420268] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10d43b0
00:17:42.010 [2024-06-10 13:45:56.420276] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:17:42.010 [2024-06-10 13:45:56.420428] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10d4350
00:17:42.010 [2024-06-10 13:45:56.420548] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10d43b0
00:17:42.010 [2024-06-10 13:45:56.420554] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10d43b0
00:17:42.010 [2024-06-10 13:45:56.420632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:42.010 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:42.270 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:42.270 "name": "raid_bdev1",
00:17:42.270 "uuid": "7c4c2f1e-8d8a-4f77-a892-64c3d753a08c",
00:17:42.270 "strip_size_kb": 64,
00:17:42.270 "state": "online",
00:17:42.270 "raid_level": "concat",
00:17:42.270 "superblock": true,
00:17:42.270 "num_base_bdevs": 4,
00:17:42.270 "num_base_bdevs_discovered": 4,
00:17:42.270 "num_base_bdevs_operational": 4,
00:17:42.270 "base_bdevs_list": [
00:17:42.270 {
00:17:42.270 "name": "BaseBdev1",
00:17:42.270 "uuid": "8468ec9b-c596-56c1-82e2-aab56b455db9",
00:17:42.270 "is_configured": true,
00:17:42.270 "data_offset": 2048,
00:17:42.270 "data_size": 63488
00:17:42.270 },
00:17:42.270 {
00:17:42.270 "name": "BaseBdev2",
00:17:42.270 "uuid": "877fa685-0393-5348-9d2e-563a2a54457c",
00:17:42.270 "is_configured": true,
00:17:42.270 "data_offset": 2048,
00:17:42.270 "data_size": 63488
00:17:42.270 },
00:17:42.270 {
00:17:42.270 "name": "BaseBdev3",
00:17:42.270 "uuid": "492546cd-66ec-5e36-93bf-8e9d7a2093b2",
00:17:42.270 "is_configured": true,
00:17:42.270 "data_offset": 2048,
00:17:42.270 "data_size": 63488
00:17:42.270 },
00:17:42.270 {
00:17:42.270 "name": "BaseBdev4",
00:17:42.270 "uuid": "c01d49ba-19e9-51bd-9c1b-dfffec26a560",
00:17:42.270 "is_configured": true,
00:17:42.270 "data_offset": 2048,
00:17:42.270 "data_size": 63488
00:17:42.270 }
00:17:42.270 ]
00:17:42.270 }'
00:17:42.270 13:45:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:17:42.270 13:45:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:42.840 13:45:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:17:42.840 13:45:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:17:42.840 [2024-06-10 13:45:57.281254] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf28830
00:17:43.780 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]]
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:44.039 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:44.299 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:44.299 "name": "raid_bdev1",
00:17:44.299 "uuid": "7c4c2f1e-8d8a-4f77-a892-64c3d753a08c",
00:17:44.299 "strip_size_kb": 64,
00:17:44.299 "state": "online",
00:17:44.299 "raid_level": "concat",
00:17:44.299 "superblock": true,
00:17:44.299 "num_base_bdevs": 4,
00:17:44.299 "num_base_bdevs_discovered": 4,
00:17:44.299 "num_base_bdevs_operational": 4,
00:17:44.299 "base_bdevs_list": [
00:17:44.299 {
00:17:44.299 "name": "BaseBdev1",
00:17:44.299 "uuid": "8468ec9b-c596-56c1-82e2-aab56b455db9",
00:17:44.299 "is_configured": true,
00:17:44.299 "data_offset": 2048,
00:17:44.299 "data_size": 63488
00:17:44.299 },
00:17:44.299 {
00:17:44.299 "name": "BaseBdev2",
00:17:44.299 "uuid": "877fa685-0393-5348-9d2e-563a2a54457c",
00:17:44.299 "is_configured": true,
00:17:44.299 "data_offset": 2048,
00:17:44.299 "data_size": 63488
00:17:44.299 },
00:17:44.299 {
00:17:44.299 "name": "BaseBdev3",
00:17:44.299 "uuid": "492546cd-66ec-5e36-93bf-8e9d7a2093b2",
00:17:44.299 "is_configured": true,
00:17:44.299 "data_offset": 2048,
00:17:44.299 "data_size": 63488
00:17:44.299 },
00:17:44.299 {
00:17:44.299 "name": "BaseBdev4",
00:17:44.299 "uuid": "c01d49ba-19e9-51bd-9c1b-dfffec26a560",
00:17:44.299 "is_configured": true,
00:17:44.299 "data_offset": 2048,
00:17:44.299 "data_size": 63488
00:17:44.299 }
00:17:44.299 ]
00:17:44.299 }'
00:17:44.299 13:45:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:17:44.299 13:45:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:44.869 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:17:44.869 [2024-06-10 13:45:59.344069] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:17:44.869 [2024-06-10 13:45:59.344095] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:17:45.129 [2024-06-10 13:45:59.346895] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:17:45.129 [2024-06-10 13:45:59.346929] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:17:45.129 [2024-06-10 13:45:59.346958] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:17:45.129 [2024-06-10 13:45:59.346964] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10d43b0 name raid_bdev1, state offline
00:17:45.129 0
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1597407
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1597407 ']'
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1597407
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1597407
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1597407'
00:17:45.129 killing process with pid 1597407
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1597407
00:17:45.129 [2024-06-10 13:45:59.414135] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1597407
00:17:45.129 [2024-06-10 13:45:59.431676] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.iXnkvGMIDS
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]]
00:17:45.129
00:17:45.129 real 0m6.639s
00:17:45.129 user 0m10.746s
00:17:45.129 sys 0m0.931s
00:17:45.129 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable
00:17:45.130 13:45:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:45.130 ************************************
00:17:45.130 END TEST raid_read_error_test
00:17:45.130 ************************************
00:17:45.389 13:45:59 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write
00:17:45.389 13:45:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']'
00:17:45.389 13:45:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable
00:17:45.389 13:45:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:17:45.389 ************************************
00:17:45.389 START TEST raid_write_error_test
00:17:45.389 ************************************
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 write
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']'
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64'
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:17:45.389 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.oq6hI8H7Er
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1598811
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1598811 /var/tmp/spdk-raid.sock
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1598811 ']'
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:17:45.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable
00:17:45.390 13:45:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:45.390 [2024-06-10 13:45:59.719414] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:17:45.390 [2024-06-10 13:45:59.719470] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598811 ]
00:17:45.390 [2024-06-10 13:45:59.808052] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:17:45.649 [2024-06-10 13:45:59.878239] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:17:45.649 [2024-06-10 13:45:59.921394] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:17:45.649 [2024-06-10 13:45:59.921416] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:17:46.218 13:46:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:17:46.218 13:46:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0
00:17:46.218 13:46:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:46.218 13:46:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:17:46.478 BaseBdev1_malloc
00:17:46.478 13:46:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:17:46.738 true
00:17:46.738 13:46:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:17:46.738 [2024-06-10 13:46:01.157115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:17:46.738 [2024-06-10 13:46:01.157148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:46.738 [2024-06-10 13:46:01.157160] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbf4c90
00:17:46.738 [2024-06-10 13:46:01.157171] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:46.738 [2024-06-10 13:46:01.158617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:46.738 [2024-06-10 13:46:01.158638] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:17:46.738 BaseBdev1
00:17:46.738 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:46.738 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:17:46.997 BaseBdev2_malloc
00:17:46.997 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:17:47.256 true
00:17:47.256 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:17:47.533 [2024-06-10 13:46:01.732639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:17:47.533 [2024-06-10 13:46:01.732669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:47.533 [2024-06-10 13:46:01.732686] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbf9400
00:17:47.533 [2024-06-10 13:46:01.732693] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:47.533 [2024-06-10 13:46:01.733945] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:47.533 [2024-06-10 13:46:01.733964] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:17:47.533 BaseBdev2
00:17:47.533 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:47.533 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:17:47.533 BaseBdev3_malloc
00:17:47.533 13:46:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:17:47.843 true
00:17:47.843 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:17:48.131 [2024-06-10 13:46:02.336224] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:17:48.131 [2024-06-10 13:46:02.336252] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:48.132 [2024-06-10 13:46:02.336265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbfbfc0
00:17:48.132 [2024-06-10 13:46:02.336272] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:48.132 [2024-06-10 13:46:02.337527] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:48.132 [2024-06-10 13:46:02.337547] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:17:48.132 BaseBdev3
00:17:48.132 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:17:48.132 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:17:48.392 BaseBdev4_malloc
00:17:48.392 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:17:48.652 true
00:17:48.652 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:17:48.652 [2024-06-10 13:46:02.927790] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:17:48.652 [2024-06-10 13:46:02.927818] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:48.652 [2024-06-10 13:46:02.927832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbfc710
00:17:48.652 [2024-06-10 13:46:02.927838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:48.652 [2024-06-10 13:46:02.929088] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:48.652 [2024-06-10 13:46:02.929108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:17:48.652 BaseBdev4
00:17:48.652 13:46:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:17:48.653 [2024-06-10 13:46:03.116289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:17:48.653 [2024-06-10 13:46:03.117346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:17:48.653 [2024-06-10 13:46:03.117400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:17:48.653 [2024-06-10 13:46:03.117451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:17:48.653 [2024-06-10 13:46:03.117637] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbf63b0
00:17:48.653 [2024-06-10 13:46:03.117648] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:17:48.653 [2024-06-10 13:46:03.117797] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf6350
00:17:48.653 [2024-06-10 13:46:03.117917] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbf63b0
00:17:48.653 [2024-06-10 13:46:03.117923] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbf63b0
00:17:48.653 [2024-06-10 13:46:03.118000] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:48.912 "name": "raid_bdev1",
00:17:48.912 "uuid": "d1e157b7-c855-41d0-ae80-12920ac9c823",
00:17:48.912 "strip_size_kb": 64,
00:17:48.912 "state": "online",
00:17:48.912 "raid_level": "concat",
00:17:48.912 "superblock": true,
00:17:48.912 "num_base_bdevs": 4,
00:17:48.912 "num_base_bdevs_discovered": 4,
00:17:48.912 "num_base_bdevs_operational": 4,
00:17:48.912 "base_bdevs_list": [
00:17:48.912 {
00:17:48.912 "name": "BaseBdev1",
00:17:48.912 "uuid": "767e6707-e8e3-58e1-ac8f-6f830cc1499f",
00:17:48.912 "is_configured": true,
00:17:48.912 "data_offset": 2048,
00:17:48.912 "data_size": 63488
00:17:48.912 },
00:17:48.912 {
00:17:48.912 "name": "BaseBdev2",
00:17:48.912 "uuid": "7a535ea1-1783-5496-be53-7e07a06ad6c5",
00:17:48.912 "is_configured": true,
00:17:48.912 "data_offset": 2048,
00:17:48.912 "data_size": 63488
00:17:48.912 },
00:17:48.912 {
00:17:48.912 "name": "BaseBdev3",
00:17:48.912 "uuid": "b055e23a-21d2-5071-813d-7232ea1dc183",
00:17:48.912 "is_configured": true,
00:17:48.912 "data_offset": 2048,
00:17:48.912 "data_size": 63488
00:17:48.912 },
00:17:48.912 {
00:17:48.912 "name": "BaseBdev4",
00:17:48.912 "uuid": "091b9860-1631-59da-9886-2096f81ae31e",
00:17:48.912 "is_configured": true,
00:17:48.912 "data_offset": 2048,
00:17:48.912 "data_size": 63488
00:17:48.912 }
00:17:48.912 ]
00:17:48.912 }'
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:17:48.912 13:46:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:17:49.481 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:17:49.481 13:46:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:17:49.741 [2024-06-10 13:46:03.962636] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4a830
00:17:50.680 13:46:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]]
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:50.680 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:50.940 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:50.940 "name": "raid_bdev1",
00:17:50.940 "uuid": "d1e157b7-c855-41d0-ae80-12920ac9c823",
00:17:50.940 "strip_size_kb": 64,
00:17:50.940 "state": "online",
00:17:50.940 "raid_level": "concat",
00:17:50.940 "superblock": true,
00:17:50.940 "num_base_bdevs": 4,
00:17:50.940 "num_base_bdevs_discovered": 4,
00:17:50.940 "num_base_bdevs_operational": 4,
00:17:50.940 "base_bdevs_list": [
00:17:50.940 {
00:17:50.940 "name": "BaseBdev1",
00:17:50.940 "uuid": "767e6707-e8e3-58e1-ac8f-6f830cc1499f",
00:17:50.940 "is_configured": true,
00:17:50.940 "data_offset": 2048,
00:17:50.940 "data_size": 63488
00:17:50.940 },
00:17:50.940 {
00:17:50.940 "name": "BaseBdev2",
"uuid": "7a535ea1-1783-5496-be53-7e07a06ad6c5", 00:17:50.940 "is_configured": true, 00:17:50.940 "data_offset": 2048, 00:17:50.940 "data_size": 63488 00:17:50.940 }, 00:17:50.940 { 00:17:50.940 "name": "BaseBdev3", 00:17:50.940 "uuid": "b055e23a-21d2-5071-813d-7232ea1dc183", 00:17:50.940 "is_configured": true, 00:17:50.940 "data_offset": 2048, 00:17:50.940 "data_size": 63488 00:17:50.940 }, 00:17:50.940 { 00:17:50.940 "name": "BaseBdev4", 00:17:50.940 "uuid": "091b9860-1631-59da-9886-2096f81ae31e", 00:17:50.940 "is_configured": true, 00:17:50.940 "data_offset": 2048, 00:17:50.940 "data_size": 63488 00:17:50.940 } 00:17:50.940 ] 00:17:50.940 }' 00:17:50.940 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.940 13:46:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.510 13:46:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:51.771 [2024-06-10 13:46:05.995884] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:51.771 [2024-06-10 13:46:05.995913] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:51.771 [2024-06-10 13:46:05.998709] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:51.771 [2024-06-10 13:46:05.998741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:51.771 [2024-06-10 13:46:05.998772] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:51.771 [2024-06-10 13:46:05.998777] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbf63b0 name raid_bdev1, state offline 00:17:51.771 0 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1598811 00:17:51.771 13:46:06 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1598811 ']' 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1598811 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1598811 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1598811' 00:17:51.771 killing process with pid 1598811 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1598811 00:17:51.771 [2024-06-10 13:46:06.063852] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1598811 00:17:51.771 [2024-06-10 13:46:06.081112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.oq6hI8H7Er 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:51.771 
13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:51.771 00:17:51.771 real 0m6.565s 00:17:51.771 user 0m10.626s 00:17:51.771 sys 0m0.903s 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:51.771 13:46:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.771 ************************************ 00:17:51.771 END TEST raid_write_error_test 00:17:51.771 ************************************ 00:17:52.032 13:46:06 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:52.032 13:46:06 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:17:52.032 13:46:06 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:52.032 13:46:06 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:52.032 13:46:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:52.032 ************************************ 00:17:52.032 START TEST raid_state_function_test 00:17:52.032 ************************************ 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 false 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1600152 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1600152' 00:17:52.032 Process raid pid: 1600152 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1600152 /var/tmp/spdk-raid.sock 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 1600152 ']' 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:52.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:52.032 13:46:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.032 [2024-06-10 13:46:06.358207] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:17:52.032 [2024-06-10 13:46:06.358256] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:52.032 [2024-06-10 13:46:06.446769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:52.292 [2024-06-10 13:46:06.512508] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:17:52.292 [2024-06-10 13:46:06.554451] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.292 [2024-06-10 13:46:06.554472] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.862 13:46:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:52.862 13:46:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:17:52.862 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:53.121 [2024-06-10 13:46:07.398296] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:53.122 [2024-06-10 13:46:07.398325] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:53.122 [2024-06-10 13:46:07.398332] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:53.122 [2024-06-10 13:46:07.398338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:17:53.122 [2024-06-10 13:46:07.398343] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:53.122 [2024-06-10 13:46:07.398349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:53.122 [2024-06-10 13:46:07.398358] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:53.122 [2024-06-10 13:46:07.398365] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.122 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:17:53.381 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.381 "name": "Existed_Raid", 00:17:53.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.381 "strip_size_kb": 0, 00:17:53.381 "state": "configuring", 00:17:53.381 "raid_level": "raid1", 00:17:53.381 "superblock": false, 00:17:53.381 "num_base_bdevs": 4, 00:17:53.381 "num_base_bdevs_discovered": 0, 00:17:53.382 "num_base_bdevs_operational": 4, 00:17:53.382 "base_bdevs_list": [ 00:17:53.382 { 00:17:53.382 "name": "BaseBdev1", 00:17:53.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.382 "is_configured": false, 00:17:53.382 "data_offset": 0, 00:17:53.382 "data_size": 0 00:17:53.382 }, 00:17:53.382 { 00:17:53.382 "name": "BaseBdev2", 00:17:53.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.382 "is_configured": false, 00:17:53.382 "data_offset": 0, 00:17:53.382 "data_size": 0 00:17:53.382 }, 00:17:53.382 { 00:17:53.382 "name": "BaseBdev3", 00:17:53.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.382 "is_configured": false, 00:17:53.382 "data_offset": 0, 00:17:53.382 "data_size": 0 00:17:53.382 }, 00:17:53.382 { 00:17:53.382 "name": "BaseBdev4", 00:17:53.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.382 "is_configured": false, 00:17:53.382 "data_offset": 0, 00:17:53.382 "data_size": 0 00:17:53.382 } 00:17:53.382 ] 00:17:53.382 }' 00:17:53.382 13:46:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.382 13:46:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.951 13:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:53.951 [2024-06-10 13:46:08.368639] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:17:53.951 [2024-06-10 13:46:08.368657] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c1a760 name Existed_Raid, state configuring 00:17:53.951 13:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:54.212 [2024-06-10 13:46:08.569169] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:54.212 [2024-06-10 13:46:08.569186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:54.212 [2024-06-10 13:46:08.569191] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:54.212 [2024-06-10 13:46:08.569197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:54.212 [2024-06-10 13:46:08.569202] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:54.212 [2024-06-10 13:46:08.569212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:54.212 [2024-06-10 13:46:08.569216] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:54.212 [2024-06-10 13:46:08.569222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:54.212 13:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:54.472 [2024-06-10 13:46:08.792617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.472 BaseBdev1 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:54.472 13:46:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:54.732 13:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:54.732 [ 00:17:54.732 { 00:17:54.732 "name": "BaseBdev1", 00:17:54.732 "aliases": [ 00:17:54.732 "7f85738b-5849-40a0-87da-e458a9b043b8" 00:17:54.732 ], 00:17:54.732 "product_name": "Malloc disk", 00:17:54.732 "block_size": 512, 00:17:54.732 "num_blocks": 65536, 00:17:54.732 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:17:54.732 "assigned_rate_limits": { 00:17:54.732 "rw_ios_per_sec": 0, 00:17:54.732 "rw_mbytes_per_sec": 0, 00:17:54.732 "r_mbytes_per_sec": 0, 00:17:54.732 "w_mbytes_per_sec": 0 00:17:54.732 }, 00:17:54.732 "claimed": true, 00:17:54.732 "claim_type": "exclusive_write", 00:17:54.732 "zoned": false, 00:17:54.732 "supported_io_types": { 00:17:54.732 "read": true, 00:17:54.732 "write": true, 00:17:54.732 "unmap": true, 00:17:54.732 "write_zeroes": true, 00:17:54.732 "flush": true, 00:17:54.732 "reset": true, 00:17:54.732 "compare": false, 00:17:54.732 "compare_and_write": false, 00:17:54.732 "abort": true, 00:17:54.732 "nvme_admin": false, 00:17:54.732 "nvme_io": false 00:17:54.732 }, 00:17:54.732 "memory_domains": [ 00:17:54.732 { 
00:17:54.732 "dma_device_id": "system", 00:17:54.732 "dma_device_type": 1 00:17:54.732 }, 00:17:54.732 { 00:17:54.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.732 "dma_device_type": 2 00:17:54.732 } 00:17:54.732 ], 00:17:54.732 "driver_specific": {} 00:17:54.732 } 00:17:54.732 ] 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.992 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:17:54.992 "name": "Existed_Raid", 00:17:54.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.992 "strip_size_kb": 0, 00:17:54.992 "state": "configuring", 00:17:54.992 "raid_level": "raid1", 00:17:54.992 "superblock": false, 00:17:54.992 "num_base_bdevs": 4, 00:17:54.992 "num_base_bdevs_discovered": 1, 00:17:54.992 "num_base_bdevs_operational": 4, 00:17:54.992 "base_bdevs_list": [ 00:17:54.992 { 00:17:54.992 "name": "BaseBdev1", 00:17:54.992 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:17:54.992 "is_configured": true, 00:17:54.992 "data_offset": 0, 00:17:54.992 "data_size": 65536 00:17:54.992 }, 00:17:54.992 { 00:17:54.992 "name": "BaseBdev2", 00:17:54.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.993 "is_configured": false, 00:17:54.993 "data_offset": 0, 00:17:54.993 "data_size": 0 00:17:54.993 }, 00:17:54.993 { 00:17:54.993 "name": "BaseBdev3", 00:17:54.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.993 "is_configured": false, 00:17:54.993 "data_offset": 0, 00:17:54.993 "data_size": 0 00:17:54.993 }, 00:17:54.993 { 00:17:54.993 "name": "BaseBdev4", 00:17:54.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.993 "is_configured": false, 00:17:54.993 "data_offset": 0, 00:17:54.993 "data_size": 0 00:17:54.993 } 00:17:54.993 ] 00:17:54.993 }' 00:17:54.993 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.993 13:46:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.562 13:46:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:55.821 [2024-06-10 13:46:10.164111] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:55.821 [2024-06-10 13:46:10.164144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c19fd0 name Existed_Raid, state configuring 
00:17:55.821 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:56.081 [2024-06-10 13:46:10.356625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:56.081 [2024-06-10 13:46:10.357847] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:56.081 [2024-06-10 13:46:10.357871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:56.081 [2024-06-10 13:46:10.357877] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:56.081 [2024-06-10 13:46:10.357883] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:56.081 [2024-06-10 13:46:10.357889] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:56.081 [2024-06-10 13:46:10.357894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.081 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.341 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.341 "name": "Existed_Raid", 00:17:56.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.341 "strip_size_kb": 0, 00:17:56.341 "state": "configuring", 00:17:56.341 "raid_level": "raid1", 00:17:56.341 "superblock": false, 00:17:56.341 "num_base_bdevs": 4, 00:17:56.341 "num_base_bdevs_discovered": 1, 00:17:56.341 "num_base_bdevs_operational": 4, 00:17:56.341 "base_bdevs_list": [ 00:17:56.341 { 00:17:56.341 "name": "BaseBdev1", 00:17:56.341 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:17:56.341 "is_configured": true, 00:17:56.341 "data_offset": 0, 00:17:56.341 "data_size": 65536 00:17:56.341 }, 00:17:56.341 { 00:17:56.341 "name": "BaseBdev2", 00:17:56.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.341 "is_configured": false, 00:17:56.341 "data_offset": 0, 00:17:56.341 "data_size": 0 00:17:56.341 }, 00:17:56.341 { 00:17:56.341 "name": "BaseBdev3", 00:17:56.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.341 "is_configured": false, 00:17:56.341 
"data_offset": 0, 00:17:56.341 "data_size": 0 00:17:56.341 }, 00:17:56.341 { 00:17:56.341 "name": "BaseBdev4", 00:17:56.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.341 "is_configured": false, 00:17:56.341 "data_offset": 0, 00:17:56.341 "data_size": 0 00:17:56.341 } 00:17:56.341 ] 00:17:56.341 }' 00:17:56.341 13:46:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.341 13:46:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:56.911 [2024-06-10 13:46:11.344297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:56.911 BaseBdev2 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:56.911 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.171 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:57.432 [ 
00:17:57.432 { 00:17:57.432 "name": "BaseBdev2", 00:17:57.432 "aliases": [ 00:17:57.432 "3a9481e3-11c7-420e-9dbf-2fcb8b188290" 00:17:57.432 ], 00:17:57.432 "product_name": "Malloc disk", 00:17:57.432 "block_size": 512, 00:17:57.432 "num_blocks": 65536, 00:17:57.432 "uuid": "3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:17:57.432 "assigned_rate_limits": { 00:17:57.432 "rw_ios_per_sec": 0, 00:17:57.432 "rw_mbytes_per_sec": 0, 00:17:57.432 "r_mbytes_per_sec": 0, 00:17:57.432 "w_mbytes_per_sec": 0 00:17:57.432 }, 00:17:57.432 "claimed": true, 00:17:57.432 "claim_type": "exclusive_write", 00:17:57.432 "zoned": false, 00:17:57.432 "supported_io_types": { 00:17:57.432 "read": true, 00:17:57.432 "write": true, 00:17:57.432 "unmap": true, 00:17:57.432 "write_zeroes": true, 00:17:57.432 "flush": true, 00:17:57.432 "reset": true, 00:17:57.432 "compare": false, 00:17:57.432 "compare_and_write": false, 00:17:57.432 "abort": true, 00:17:57.432 "nvme_admin": false, 00:17:57.432 "nvme_io": false 00:17:57.432 }, 00:17:57.432 "memory_domains": [ 00:17:57.432 { 00:17:57.432 "dma_device_id": "system", 00:17:57.432 "dma_device_type": 1 00:17:57.432 }, 00:17:57.432 { 00:17:57.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.432 "dma_device_type": 2 00:17:57.432 } 00:17:57.432 ], 00:17:57.432 "driver_specific": {} 00:17:57.432 } 00:17:57.432 ] 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.432 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.692 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.692 "name": "Existed_Raid", 00:17:57.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.692 "strip_size_kb": 0, 00:17:57.692 "state": "configuring", 00:17:57.692 "raid_level": "raid1", 00:17:57.692 "superblock": false, 00:17:57.692 "num_base_bdevs": 4, 00:17:57.692 "num_base_bdevs_discovered": 2, 00:17:57.692 "num_base_bdevs_operational": 4, 00:17:57.692 "base_bdevs_list": [ 00:17:57.692 { 00:17:57.692 "name": "BaseBdev1", 00:17:57.692 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:17:57.692 "is_configured": true, 00:17:57.692 "data_offset": 0, 00:17:57.692 "data_size": 65536 00:17:57.692 }, 00:17:57.692 { 00:17:57.692 "name": "BaseBdev2", 00:17:57.692 "uuid": 
"3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:17:57.692 "is_configured": true, 00:17:57.692 "data_offset": 0, 00:17:57.692 "data_size": 65536 00:17:57.692 }, 00:17:57.692 { 00:17:57.692 "name": "BaseBdev3", 00:17:57.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.692 "is_configured": false, 00:17:57.692 "data_offset": 0, 00:17:57.692 "data_size": 0 00:17:57.692 }, 00:17:57.692 { 00:17:57.692 "name": "BaseBdev4", 00:17:57.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.692 "is_configured": false, 00:17:57.692 "data_offset": 0, 00:17:57.692 "data_size": 0 00:17:57.692 } 00:17:57.692 ] 00:17:57.692 }' 00:17:57.692 13:46:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.692 13:46:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.261 13:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:58.261 [2024-06-10 13:46:12.720914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:58.261 BaseBdev3 00:17:58.261 13:46:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:58.262 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:58.521 13:46:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:58.781 [ 00:17:58.781 { 00:17:58.781 "name": "BaseBdev3", 00:17:58.781 "aliases": [ 00:17:58.781 "304f740b-7b6c-4424-82ce-19a8571a24e7" 00:17:58.781 ], 00:17:58.781 "product_name": "Malloc disk", 00:17:58.781 "block_size": 512, 00:17:58.781 "num_blocks": 65536, 00:17:58.781 "uuid": "304f740b-7b6c-4424-82ce-19a8571a24e7", 00:17:58.781 "assigned_rate_limits": { 00:17:58.781 "rw_ios_per_sec": 0, 00:17:58.781 "rw_mbytes_per_sec": 0, 00:17:58.781 "r_mbytes_per_sec": 0, 00:17:58.781 "w_mbytes_per_sec": 0 00:17:58.781 }, 00:17:58.781 "claimed": true, 00:17:58.781 "claim_type": "exclusive_write", 00:17:58.781 "zoned": false, 00:17:58.781 "supported_io_types": { 00:17:58.781 "read": true, 00:17:58.781 "write": true, 00:17:58.781 "unmap": true, 00:17:58.781 "write_zeroes": true, 00:17:58.781 "flush": true, 00:17:58.781 "reset": true, 00:17:58.781 "compare": false, 00:17:58.781 "compare_and_write": false, 00:17:58.781 "abort": true, 00:17:58.781 "nvme_admin": false, 00:17:58.781 "nvme_io": false 00:17:58.781 }, 00:17:58.781 "memory_domains": [ 00:17:58.781 { 00:17:58.781 "dma_device_id": "system", 00:17:58.781 "dma_device_type": 1 00:17:58.781 }, 00:17:58.781 { 00:17:58.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.781 "dma_device_type": 2 00:17:58.781 } 00:17:58.781 ], 00:17:58.781 "driver_specific": {} 00:17:58.781 } 00:17:58.781 ] 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.781 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.041 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.041 "name": "Existed_Raid", 00:17:59.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.041 "strip_size_kb": 0, 00:17:59.041 "state": "configuring", 00:17:59.041 "raid_level": "raid1", 00:17:59.041 "superblock": false, 00:17:59.041 "num_base_bdevs": 4, 00:17:59.041 "num_base_bdevs_discovered": 3, 00:17:59.041 "num_base_bdevs_operational": 4, 00:17:59.041 
"base_bdevs_list": [ 00:17:59.041 { 00:17:59.041 "name": "BaseBdev1", 00:17:59.041 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:17:59.041 "is_configured": true, 00:17:59.041 "data_offset": 0, 00:17:59.041 "data_size": 65536 00:17:59.041 }, 00:17:59.041 { 00:17:59.042 "name": "BaseBdev2", 00:17:59.042 "uuid": "3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:17:59.042 "is_configured": true, 00:17:59.042 "data_offset": 0, 00:17:59.042 "data_size": 65536 00:17:59.042 }, 00:17:59.042 { 00:17:59.042 "name": "BaseBdev3", 00:17:59.042 "uuid": "304f740b-7b6c-4424-82ce-19a8571a24e7", 00:17:59.042 "is_configured": true, 00:17:59.042 "data_offset": 0, 00:17:59.042 "data_size": 65536 00:17:59.042 }, 00:17:59.042 { 00:17:59.042 "name": "BaseBdev4", 00:17:59.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.042 "is_configured": false, 00:17:59.042 "data_offset": 0, 00:17:59.042 "data_size": 0 00:17:59.042 } 00:17:59.042 ] 00:17:59.042 }' 00:17:59.042 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.042 13:46:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.611 13:46:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:59.611 [2024-06-10 13:46:14.061354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:59.611 [2024-06-10 13:46:14.061379] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c1b030 00:17:59.611 [2024-06-10 13:46:14.061388] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:59.611 [2024-06-10 13:46:14.061546] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dce250 00:17:59.611 [2024-06-10 13:46:14.061650] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c1b030 00:17:59.611 
[2024-06-10 13:46:14.061656] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c1b030 00:17:59.611 [2024-06-10 13:46:14.061780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:59.611 BaseBdev4 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:59.611 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:59.870 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:00.130 [ 00:18:00.130 { 00:18:00.130 "name": "BaseBdev4", 00:18:00.130 "aliases": [ 00:18:00.130 "50309063-3033-43b2-a03b-c70fe67dac0c" 00:18:00.130 ], 00:18:00.130 "product_name": "Malloc disk", 00:18:00.130 "block_size": 512, 00:18:00.130 "num_blocks": 65536, 00:18:00.130 "uuid": "50309063-3033-43b2-a03b-c70fe67dac0c", 00:18:00.130 "assigned_rate_limits": { 00:18:00.130 "rw_ios_per_sec": 0, 00:18:00.130 "rw_mbytes_per_sec": 0, 00:18:00.130 "r_mbytes_per_sec": 0, 00:18:00.130 "w_mbytes_per_sec": 0 00:18:00.130 }, 00:18:00.131 "claimed": true, 00:18:00.131 "claim_type": "exclusive_write", 00:18:00.131 "zoned": 
false, 00:18:00.131 "supported_io_types": { 00:18:00.131 "read": true, 00:18:00.131 "write": true, 00:18:00.131 "unmap": true, 00:18:00.131 "write_zeroes": true, 00:18:00.131 "flush": true, 00:18:00.131 "reset": true, 00:18:00.131 "compare": false, 00:18:00.131 "compare_and_write": false, 00:18:00.131 "abort": true, 00:18:00.131 "nvme_admin": false, 00:18:00.131 "nvme_io": false 00:18:00.131 }, 00:18:00.131 "memory_domains": [ 00:18:00.131 { 00:18:00.131 "dma_device_id": "system", 00:18:00.131 "dma_device_type": 1 00:18:00.131 }, 00:18:00.131 { 00:18:00.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.131 "dma_device_type": 2 00:18:00.131 } 00:18:00.131 ], 00:18:00.131 "driver_specific": {} 00:18:00.131 } 00:18:00.131 ] 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.131 
13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.131 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.391 13:46:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.391 "name": "Existed_Raid", 00:18:00.391 "uuid": "d53c4aa4-7a24-4eb5-a9ca-5b90c88ff3af", 00:18:00.391 "strip_size_kb": 0, 00:18:00.391 "state": "online", 00:18:00.391 "raid_level": "raid1", 00:18:00.391 "superblock": false, 00:18:00.391 "num_base_bdevs": 4, 00:18:00.391 "num_base_bdevs_discovered": 4, 00:18:00.391 "num_base_bdevs_operational": 4, 00:18:00.391 "base_bdevs_list": [ 00:18:00.391 { 00:18:00.391 "name": "BaseBdev1", 00:18:00.391 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:18:00.391 "is_configured": true, 00:18:00.391 "data_offset": 0, 00:18:00.391 "data_size": 65536 00:18:00.391 }, 00:18:00.391 { 00:18:00.391 "name": "BaseBdev2", 00:18:00.391 "uuid": "3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:18:00.391 "is_configured": true, 00:18:00.391 "data_offset": 0, 00:18:00.391 "data_size": 65536 00:18:00.391 }, 00:18:00.391 { 00:18:00.391 "name": "BaseBdev3", 00:18:00.391 "uuid": "304f740b-7b6c-4424-82ce-19a8571a24e7", 00:18:00.391 "is_configured": true, 00:18:00.391 "data_offset": 0, 00:18:00.391 "data_size": 65536 00:18:00.391 }, 00:18:00.391 { 00:18:00.391 "name": "BaseBdev4", 00:18:00.391 "uuid": "50309063-3033-43b2-a03b-c70fe67dac0c", 00:18:00.391 "is_configured": true, 00:18:00.391 "data_offset": 0, 00:18:00.391 "data_size": 65536 00:18:00.391 } 00:18:00.391 ] 00:18:00.391 }' 00:18:00.391 13:46:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.391 13:46:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:00.961 [2024-06-10 13:46:15.409031] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:00.961 "name": "Existed_Raid", 00:18:00.961 "aliases": [ 00:18:00.961 "d53c4aa4-7a24-4eb5-a9ca-5b90c88ff3af" 00:18:00.961 ], 00:18:00.961 "product_name": "Raid Volume", 00:18:00.961 "block_size": 512, 00:18:00.961 "num_blocks": 65536, 00:18:00.961 "uuid": "d53c4aa4-7a24-4eb5-a9ca-5b90c88ff3af", 00:18:00.961 "assigned_rate_limits": { 00:18:00.961 "rw_ios_per_sec": 0, 00:18:00.961 "rw_mbytes_per_sec": 0, 00:18:00.961 "r_mbytes_per_sec": 0, 00:18:00.961 "w_mbytes_per_sec": 0 00:18:00.961 }, 00:18:00.961 "claimed": false, 00:18:00.961 "zoned": false, 00:18:00.961 "supported_io_types": { 00:18:00.961 "read": 
true, 00:18:00.961 "write": true, 00:18:00.961 "unmap": false, 00:18:00.961 "write_zeroes": true, 00:18:00.961 "flush": false, 00:18:00.961 "reset": true, 00:18:00.961 "compare": false, 00:18:00.961 "compare_and_write": false, 00:18:00.961 "abort": false, 00:18:00.961 "nvme_admin": false, 00:18:00.961 "nvme_io": false 00:18:00.961 }, 00:18:00.961 "memory_domains": [ 00:18:00.961 { 00:18:00.961 "dma_device_id": "system", 00:18:00.961 "dma_device_type": 1 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.961 "dma_device_type": 2 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "system", 00:18:00.961 "dma_device_type": 1 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.961 "dma_device_type": 2 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "system", 00:18:00.961 "dma_device_type": 1 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.961 "dma_device_type": 2 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "system", 00:18:00.961 "dma_device_type": 1 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.961 "dma_device_type": 2 00:18:00.961 } 00:18:00.961 ], 00:18:00.961 "driver_specific": { 00:18:00.961 "raid": { 00:18:00.961 "uuid": "d53c4aa4-7a24-4eb5-a9ca-5b90c88ff3af", 00:18:00.961 "strip_size_kb": 0, 00:18:00.961 "state": "online", 00:18:00.961 "raid_level": "raid1", 00:18:00.961 "superblock": false, 00:18:00.961 "num_base_bdevs": 4, 00:18:00.961 "num_base_bdevs_discovered": 4, 00:18:00.961 "num_base_bdevs_operational": 4, 00:18:00.961 "base_bdevs_list": [ 00:18:00.961 { 00:18:00.961 "name": "BaseBdev1", 00:18:00.961 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:18:00.961 "is_configured": true, 00:18:00.961 "data_offset": 0, 00:18:00.961 "data_size": 65536 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "name": "BaseBdev2", 00:18:00.961 "uuid": 
"3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:18:00.961 "is_configured": true, 00:18:00.961 "data_offset": 0, 00:18:00.961 "data_size": 65536 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "name": "BaseBdev3", 00:18:00.961 "uuid": "304f740b-7b6c-4424-82ce-19a8571a24e7", 00:18:00.961 "is_configured": true, 00:18:00.961 "data_offset": 0, 00:18:00.961 "data_size": 65536 00:18:00.961 }, 00:18:00.961 { 00:18:00.961 "name": "BaseBdev4", 00:18:00.961 "uuid": "50309063-3033-43b2-a03b-c70fe67dac0c", 00:18:00.961 "is_configured": true, 00:18:00.961 "data_offset": 0, 00:18:00.961 "data_size": 65536 00:18:00.961 } 00:18:00.961 ] 00:18:00.961 } 00:18:00.961 } 00:18:00.961 }' 00:18:00.961 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:01.221 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:01.222 BaseBdev2 00:18:01.222 BaseBdev3 00:18:01.222 BaseBdev4' 00:18:01.222 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.222 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:01.222 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.222 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.222 "name": "BaseBdev1", 00:18:01.222 "aliases": [ 00:18:01.222 "7f85738b-5849-40a0-87da-e458a9b043b8" 00:18:01.222 ], 00:18:01.222 "product_name": "Malloc disk", 00:18:01.222 "block_size": 512, 00:18:01.222 "num_blocks": 65536, 00:18:01.222 "uuid": "7f85738b-5849-40a0-87da-e458a9b043b8", 00:18:01.222 "assigned_rate_limits": { 00:18:01.222 "rw_ios_per_sec": 0, 00:18:01.222 "rw_mbytes_per_sec": 0, 00:18:01.222 "r_mbytes_per_sec": 0, 
00:18:01.222 "w_mbytes_per_sec": 0 00:18:01.222 }, 00:18:01.222 "claimed": true, 00:18:01.222 "claim_type": "exclusive_write", 00:18:01.222 "zoned": false, 00:18:01.222 "supported_io_types": { 00:18:01.222 "read": true, 00:18:01.222 "write": true, 00:18:01.222 "unmap": true, 00:18:01.222 "write_zeroes": true, 00:18:01.222 "flush": true, 00:18:01.222 "reset": true, 00:18:01.222 "compare": false, 00:18:01.222 "compare_and_write": false, 00:18:01.222 "abort": true, 00:18:01.222 "nvme_admin": false, 00:18:01.222 "nvme_io": false 00:18:01.222 }, 00:18:01.222 "memory_domains": [ 00:18:01.222 { 00:18:01.222 "dma_device_id": "system", 00:18:01.222 "dma_device_type": 1 00:18:01.222 }, 00:18:01.222 { 00:18:01.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.222 "dma_device_type": 2 00:18:01.222 } 00:18:01.222 ], 00:18:01.222 "driver_specific": {} 00:18:01.222 }' 00:18:01.222 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.481 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.741 13:46:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:18:01.741 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.741 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.741 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:01.741 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.741 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.741 "name": "BaseBdev2", 00:18:01.741 "aliases": [ 00:18:01.741 "3a9481e3-11c7-420e-9dbf-2fcb8b188290" 00:18:01.741 ], 00:18:01.741 "product_name": "Malloc disk", 00:18:01.741 "block_size": 512, 00:18:01.741 "num_blocks": 65536, 00:18:01.741 "uuid": "3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:18:01.741 "assigned_rate_limits": { 00:18:01.741 "rw_ios_per_sec": 0, 00:18:01.741 "rw_mbytes_per_sec": 0, 00:18:01.741 "r_mbytes_per_sec": 0, 00:18:01.741 "w_mbytes_per_sec": 0 00:18:01.741 }, 00:18:01.741 "claimed": true, 00:18:01.741 "claim_type": "exclusive_write", 00:18:01.741 "zoned": false, 00:18:01.741 "supported_io_types": { 00:18:01.741 "read": true, 00:18:01.741 "write": true, 00:18:01.741 "unmap": true, 00:18:01.741 "write_zeroes": true, 00:18:01.741 "flush": true, 00:18:01.741 "reset": true, 00:18:01.741 "compare": false, 00:18:01.741 "compare_and_write": false, 00:18:01.741 "abort": true, 00:18:01.741 "nvme_admin": false, 00:18:01.741 "nvme_io": false 00:18:01.741 }, 00:18:01.741 "memory_domains": [ 00:18:01.741 { 00:18:01.741 "dma_device_id": "system", 00:18:01.741 "dma_device_type": 1 00:18:01.741 }, 00:18:01.741 { 00:18:01.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.741 "dma_device_type": 2 00:18:01.741 } 00:18:01.741 ], 00:18:01.741 "driver_specific": {} 00:18:01.741 }' 00:18:02.000 13:46:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.000 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:02.259 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.519 "name": "BaseBdev3", 00:18:02.519 "aliases": [ 00:18:02.519 "304f740b-7b6c-4424-82ce-19a8571a24e7" 00:18:02.519 ], 00:18:02.519 "product_name": "Malloc disk", 00:18:02.519 "block_size": 512, 00:18:02.519 "num_blocks": 65536, 00:18:02.519 "uuid": 
"304f740b-7b6c-4424-82ce-19a8571a24e7", 00:18:02.519 "assigned_rate_limits": { 00:18:02.519 "rw_ios_per_sec": 0, 00:18:02.519 "rw_mbytes_per_sec": 0, 00:18:02.519 "r_mbytes_per_sec": 0, 00:18:02.519 "w_mbytes_per_sec": 0 00:18:02.519 }, 00:18:02.519 "claimed": true, 00:18:02.519 "claim_type": "exclusive_write", 00:18:02.519 "zoned": false, 00:18:02.519 "supported_io_types": { 00:18:02.519 "read": true, 00:18:02.519 "write": true, 00:18:02.519 "unmap": true, 00:18:02.519 "write_zeroes": true, 00:18:02.519 "flush": true, 00:18:02.519 "reset": true, 00:18:02.519 "compare": false, 00:18:02.519 "compare_and_write": false, 00:18:02.519 "abort": true, 00:18:02.519 "nvme_admin": false, 00:18:02.519 "nvme_io": false 00:18:02.519 }, 00:18:02.519 "memory_domains": [ 00:18:02.519 { 00:18:02.519 "dma_device_id": "system", 00:18:02.519 "dma_device_type": 1 00:18:02.519 }, 00:18:02.519 { 00:18:02.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.519 "dma_device_type": 2 00:18:02.519 } 00:18:02.519 ], 00:18:02.519 "driver_specific": {} 00:18:02.519 }' 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.519 13:46:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.779 
13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:02.779 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.039 "name": "BaseBdev4", 00:18:03.039 "aliases": [ 00:18:03.039 "50309063-3033-43b2-a03b-c70fe67dac0c" 00:18:03.039 ], 00:18:03.039 "product_name": "Malloc disk", 00:18:03.039 "block_size": 512, 00:18:03.039 "num_blocks": 65536, 00:18:03.039 "uuid": "50309063-3033-43b2-a03b-c70fe67dac0c", 00:18:03.039 "assigned_rate_limits": { 00:18:03.039 "rw_ios_per_sec": 0, 00:18:03.039 "rw_mbytes_per_sec": 0, 00:18:03.039 "r_mbytes_per_sec": 0, 00:18:03.039 "w_mbytes_per_sec": 0 00:18:03.039 }, 00:18:03.039 "claimed": true, 00:18:03.039 "claim_type": "exclusive_write", 00:18:03.039 "zoned": false, 00:18:03.039 "supported_io_types": { 00:18:03.039 "read": true, 00:18:03.039 "write": true, 00:18:03.039 "unmap": true, 00:18:03.039 "write_zeroes": true, 00:18:03.039 "flush": true, 00:18:03.039 "reset": true, 00:18:03.039 "compare": false, 00:18:03.039 "compare_and_write": false, 00:18:03.039 "abort": true, 00:18:03.039 "nvme_admin": false, 00:18:03.039 "nvme_io": false 00:18:03.039 }, 00:18:03.039 "memory_domains": [ 00:18:03.039 { 00:18:03.039 "dma_device_id": "system", 00:18:03.039 "dma_device_type": 1 00:18:03.039 }, 00:18:03.039 { 00:18:03.039 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:03.039 "dma_device_type": 2 00:18:03.039 } 00:18:03.039 ], 00:18:03.039 "driver_specific": {} 00:18:03.039 }' 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.039 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.300 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:03.561 [2024-06-10 13:46:17.887192] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.561 13:46:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.821 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.821 "name": "Existed_Raid", 00:18:03.821 "uuid": "d53c4aa4-7a24-4eb5-a9ca-5b90c88ff3af", 00:18:03.821 "strip_size_kb": 0, 00:18:03.821 "state": "online", 00:18:03.821 "raid_level": "raid1", 
00:18:03.821 "superblock": false, 00:18:03.821 "num_base_bdevs": 4, 00:18:03.821 "num_base_bdevs_discovered": 3, 00:18:03.821 "num_base_bdevs_operational": 3, 00:18:03.821 "base_bdevs_list": [ 00:18:03.821 { 00:18:03.821 "name": null, 00:18:03.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.821 "is_configured": false, 00:18:03.822 "data_offset": 0, 00:18:03.822 "data_size": 65536 00:18:03.822 }, 00:18:03.822 { 00:18:03.822 "name": "BaseBdev2", 00:18:03.822 "uuid": "3a9481e3-11c7-420e-9dbf-2fcb8b188290", 00:18:03.822 "is_configured": true, 00:18:03.822 "data_offset": 0, 00:18:03.822 "data_size": 65536 00:18:03.822 }, 00:18:03.822 { 00:18:03.822 "name": "BaseBdev3", 00:18:03.822 "uuid": "304f740b-7b6c-4424-82ce-19a8571a24e7", 00:18:03.822 "is_configured": true, 00:18:03.822 "data_offset": 0, 00:18:03.822 "data_size": 65536 00:18:03.822 }, 00:18:03.822 { 00:18:03.822 "name": "BaseBdev4", 00:18:03.822 "uuid": "50309063-3033-43b2-a03b-c70fe67dac0c", 00:18:03.822 "is_configured": true, 00:18:03.822 "data_offset": 0, 00:18:03.822 "data_size": 65536 00:18:03.822 } 00:18:03.822 ] 00:18:03.822 }' 00:18:03.822 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.822 13:46:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.392 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:04.392 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:04.392 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.392 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:04.652 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:04.652 13:46:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:04.652 13:46:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:04.652 [2024-06-10 13:46:19.074209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:04.652 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:04.652 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:04.652 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.652 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:04.912 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:04.912 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:04.913 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:05.173 [2024-06-10 13:46:19.481251] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:05.173 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:05.173 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:05.173 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.173 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r 
'.[0]["name"]' 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:05.433 [2024-06-10 13:46:19.880261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:05.433 [2024-06-10 13:46:19.880319] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:05.433 [2024-06-10 13:46:19.886632] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:05.433 [2024-06-10 13:46:19.886657] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:05.433 [2024-06-10 13:46:19.886662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c1b030 name Existed_Raid, state offline 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.433 13:46:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:05.694 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:05.694 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:05.694 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:05.694 13:46:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:05.694 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:05.694 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:05.955 BaseBdev2 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:05.955 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.215 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:06.215 [ 00:18:06.215 { 00:18:06.215 "name": "BaseBdev2", 00:18:06.215 "aliases": [ 00:18:06.215 "e8b0449c-a4c0-4418-b4b9-dc17d94f5356" 00:18:06.215 ], 00:18:06.215 "product_name": "Malloc disk", 00:18:06.215 "block_size": 512, 00:18:06.215 "num_blocks": 65536, 00:18:06.215 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:06.215 "assigned_rate_limits": { 00:18:06.215 "rw_ios_per_sec": 0, 00:18:06.215 "rw_mbytes_per_sec": 0, 00:18:06.215 "r_mbytes_per_sec": 0, 
00:18:06.215 "w_mbytes_per_sec": 0 00:18:06.215 }, 00:18:06.215 "claimed": false, 00:18:06.215 "zoned": false, 00:18:06.215 "supported_io_types": { 00:18:06.215 "read": true, 00:18:06.215 "write": true, 00:18:06.215 "unmap": true, 00:18:06.215 "write_zeroes": true, 00:18:06.215 "flush": true, 00:18:06.215 "reset": true, 00:18:06.215 "compare": false, 00:18:06.215 "compare_and_write": false, 00:18:06.215 "abort": true, 00:18:06.215 "nvme_admin": false, 00:18:06.215 "nvme_io": false 00:18:06.215 }, 00:18:06.215 "memory_domains": [ 00:18:06.215 { 00:18:06.215 "dma_device_id": "system", 00:18:06.215 "dma_device_type": 1 00:18:06.215 }, 00:18:06.215 { 00:18:06.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.215 "dma_device_type": 2 00:18:06.215 } 00:18:06.215 ], 00:18:06.215 "driver_specific": {} 00:18:06.215 } 00:18:06.215 ] 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:06.475 BaseBdev3 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:06.475 13:46:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:06.475 13:46:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.736 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:06.996 [ 00:18:06.996 { 00:18:06.996 "name": "BaseBdev3", 00:18:06.996 "aliases": [ 00:18:06.996 "10774db7-c457-4104-a174-e4a5857c1e34" 00:18:06.996 ], 00:18:06.996 "product_name": "Malloc disk", 00:18:06.996 "block_size": 512, 00:18:06.996 "num_blocks": 65536, 00:18:06.996 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:06.996 "assigned_rate_limits": { 00:18:06.996 "rw_ios_per_sec": 0, 00:18:06.996 "rw_mbytes_per_sec": 0, 00:18:06.996 "r_mbytes_per_sec": 0, 00:18:06.996 "w_mbytes_per_sec": 0 00:18:06.996 }, 00:18:06.996 "claimed": false, 00:18:06.996 "zoned": false, 00:18:06.996 "supported_io_types": { 00:18:06.996 "read": true, 00:18:06.996 "write": true, 00:18:06.996 "unmap": true, 00:18:06.996 "write_zeroes": true, 00:18:06.996 "flush": true, 00:18:06.996 "reset": true, 00:18:06.996 "compare": false, 00:18:06.996 "compare_and_write": false, 00:18:06.996 "abort": true, 00:18:06.996 "nvme_admin": false, 00:18:06.996 "nvme_io": false 00:18:06.996 }, 00:18:06.996 "memory_domains": [ 00:18:06.996 { 00:18:06.996 "dma_device_id": "system", 00:18:06.996 "dma_device_type": 1 00:18:06.996 }, 00:18:06.996 { 00:18:06.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.996 "dma_device_type": 2 00:18:06.996 } 00:18:06.996 ], 00:18:06.996 "driver_specific": {} 00:18:06.996 } 00:18:06.996 ] 00:18:06.996 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:06.996 13:46:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:06.996 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:06.996 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:07.256 BaseBdev4 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.256 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:07.516 [ 00:18:07.516 { 00:18:07.516 "name": "BaseBdev4", 00:18:07.516 "aliases": [ 00:18:07.516 "cee39ba2-1a50-4a74-ba14-ad56023adcf7" 00:18:07.516 ], 00:18:07.516 "product_name": "Malloc disk", 00:18:07.516 "block_size": 512, 00:18:07.516 "num_blocks": 65536, 00:18:07.516 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:07.516 "assigned_rate_limits": { 00:18:07.516 "rw_ios_per_sec": 0, 00:18:07.516 "rw_mbytes_per_sec": 0, 00:18:07.516 "r_mbytes_per_sec": 0, 00:18:07.516 "w_mbytes_per_sec": 0 
00:18:07.516 }, 00:18:07.516 "claimed": false, 00:18:07.516 "zoned": false, 00:18:07.516 "supported_io_types": { 00:18:07.516 "read": true, 00:18:07.516 "write": true, 00:18:07.516 "unmap": true, 00:18:07.516 "write_zeroes": true, 00:18:07.516 "flush": true, 00:18:07.516 "reset": true, 00:18:07.516 "compare": false, 00:18:07.516 "compare_and_write": false, 00:18:07.516 "abort": true, 00:18:07.516 "nvme_admin": false, 00:18:07.516 "nvme_io": false 00:18:07.516 }, 00:18:07.516 "memory_domains": [ 00:18:07.516 { 00:18:07.516 "dma_device_id": "system", 00:18:07.516 "dma_device_type": 1 00:18:07.516 }, 00:18:07.516 { 00:18:07.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.516 "dma_device_type": 2 00:18:07.516 } 00:18:07.516 ], 00:18:07.516 "driver_specific": {} 00:18:07.516 } 00:18:07.516 ] 00:18:07.516 13:46:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:07.516 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:07.516 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:07.516 13:46:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:07.776 [2024-06-10 13:46:22.092286] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:07.776 [2024-06-10 13:46:22.092315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:07.776 [2024-06-10 13:46:22.092329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:07.776 [2024-06-10 13:46:22.093421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:07.776 [2024-06-10 13:46:22.093455] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev4 is claimed 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.776 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.038 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.038 "name": "Existed_Raid", 00:18:08.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.038 "strip_size_kb": 0, 00:18:08.038 "state": "configuring", 00:18:08.038 "raid_level": "raid1", 00:18:08.038 "superblock": false, 00:18:08.038 "num_base_bdevs": 4, 00:18:08.038 "num_base_bdevs_discovered": 3, 00:18:08.038 "num_base_bdevs_operational": 4, 00:18:08.038 
"base_bdevs_list": [ 00:18:08.038 { 00:18:08.038 "name": "BaseBdev1", 00:18:08.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.038 "is_configured": false, 00:18:08.038 "data_offset": 0, 00:18:08.038 "data_size": 0 00:18:08.038 }, 00:18:08.038 { 00:18:08.038 "name": "BaseBdev2", 00:18:08.038 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:08.038 "is_configured": true, 00:18:08.038 "data_offset": 0, 00:18:08.038 "data_size": 65536 00:18:08.038 }, 00:18:08.038 { 00:18:08.038 "name": "BaseBdev3", 00:18:08.038 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:08.038 "is_configured": true, 00:18:08.038 "data_offset": 0, 00:18:08.038 "data_size": 65536 00:18:08.038 }, 00:18:08.038 { 00:18:08.038 "name": "BaseBdev4", 00:18:08.038 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:08.038 "is_configured": true, 00:18:08.038 "data_offset": 0, 00:18:08.038 "data_size": 65536 00:18:08.038 } 00:18:08.038 ] 00:18:08.038 }' 00:18:08.038 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.038 13:46:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.609 13:46:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:08.609 [2024-06-10 13:46:23.062718] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.609 13:46:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.609 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.869 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.869 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.869 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.869 "name": "Existed_Raid", 00:18:08.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.869 "strip_size_kb": 0, 00:18:08.869 "state": "configuring", 00:18:08.869 "raid_level": "raid1", 00:18:08.869 "superblock": false, 00:18:08.869 "num_base_bdevs": 4, 00:18:08.869 "num_base_bdevs_discovered": 2, 00:18:08.869 "num_base_bdevs_operational": 4, 00:18:08.869 "base_bdevs_list": [ 00:18:08.869 { 00:18:08.869 "name": "BaseBdev1", 00:18:08.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.869 "is_configured": false, 00:18:08.869 "data_offset": 0, 00:18:08.869 "data_size": 0 00:18:08.869 }, 00:18:08.869 { 00:18:08.869 "name": null, 00:18:08.869 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:08.869 "is_configured": false, 00:18:08.869 "data_offset": 0, 00:18:08.869 "data_size": 65536 00:18:08.869 }, 00:18:08.869 { 00:18:08.869 "name": "BaseBdev3", 00:18:08.869 "uuid": 
"10774db7-c457-4104-a174-e4a5857c1e34", 00:18:08.869 "is_configured": true, 00:18:08.869 "data_offset": 0, 00:18:08.869 "data_size": 65536 00:18:08.869 }, 00:18:08.869 { 00:18:08.869 "name": "BaseBdev4", 00:18:08.869 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:08.869 "is_configured": true, 00:18:08.869 "data_offset": 0, 00:18:08.869 "data_size": 65536 00:18:08.869 } 00:18:08.869 ] 00:18:08.869 }' 00:18:08.869 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.869 13:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.440 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.440 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:09.700 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:09.700 13:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:09.700 [2024-06-10 13:46:24.158620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.700 BaseBdev1 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:09.700 13:46:24 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:09.700 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:09.960 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:10.220 [ 00:18:10.220 { 00:18:10.220 "name": "BaseBdev1", 00:18:10.220 "aliases": [ 00:18:10.220 "46cd3bff-a0ef-4eb6-9469-865b7994cc61" 00:18:10.220 ], 00:18:10.220 "product_name": "Malloc disk", 00:18:10.221 "block_size": 512, 00:18:10.221 "num_blocks": 65536, 00:18:10.221 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:10.221 "assigned_rate_limits": { 00:18:10.221 "rw_ios_per_sec": 0, 00:18:10.221 "rw_mbytes_per_sec": 0, 00:18:10.221 "r_mbytes_per_sec": 0, 00:18:10.221 "w_mbytes_per_sec": 0 00:18:10.221 }, 00:18:10.221 "claimed": true, 00:18:10.221 "claim_type": "exclusive_write", 00:18:10.221 "zoned": false, 00:18:10.221 "supported_io_types": { 00:18:10.221 "read": true, 00:18:10.221 "write": true, 00:18:10.221 "unmap": true, 00:18:10.221 "write_zeroes": true, 00:18:10.221 "flush": true, 00:18:10.221 "reset": true, 00:18:10.221 "compare": false, 00:18:10.221 "compare_and_write": false, 00:18:10.221 "abort": true, 00:18:10.221 "nvme_admin": false, 00:18:10.221 "nvme_io": false 00:18:10.221 }, 00:18:10.221 "memory_domains": [ 00:18:10.221 { 00:18:10.221 "dma_device_id": "system", 00:18:10.221 "dma_device_type": 1 00:18:10.221 }, 00:18:10.221 { 00:18:10.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.221 "dma_device_type": 2 00:18:10.221 } 00:18:10.221 ], 00:18:10.221 "driver_specific": {} 00:18:10.221 } 00:18:10.221 ] 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:10.221 
13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.221 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.481 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.481 "name": "Existed_Raid", 00:18:10.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.481 "strip_size_kb": 0, 00:18:10.481 "state": "configuring", 00:18:10.481 "raid_level": "raid1", 00:18:10.481 "superblock": false, 00:18:10.481 "num_base_bdevs": 4, 00:18:10.481 "num_base_bdevs_discovered": 3, 00:18:10.481 "num_base_bdevs_operational": 4, 00:18:10.481 "base_bdevs_list": [ 00:18:10.481 { 
00:18:10.481 "name": "BaseBdev1", 00:18:10.481 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:10.481 "is_configured": true, 00:18:10.481 "data_offset": 0, 00:18:10.481 "data_size": 65536 00:18:10.481 }, 00:18:10.481 { 00:18:10.481 "name": null, 00:18:10.481 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:10.481 "is_configured": false, 00:18:10.481 "data_offset": 0, 00:18:10.481 "data_size": 65536 00:18:10.481 }, 00:18:10.481 { 00:18:10.481 "name": "BaseBdev3", 00:18:10.481 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:10.481 "is_configured": true, 00:18:10.481 "data_offset": 0, 00:18:10.481 "data_size": 65536 00:18:10.481 }, 00:18:10.481 { 00:18:10.481 "name": "BaseBdev4", 00:18:10.481 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:10.481 "is_configured": true, 00:18:10.481 "data_offset": 0, 00:18:10.481 "data_size": 65536 00:18:10.481 } 00:18:10.481 ] 00:18:10.481 }' 00:18:10.481 13:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.481 13:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.051 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.051 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:11.311 [2024-06-10 13:46:25.738643] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid 
configuring raid1 0 4 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.311 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.571 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.571 "name": "Existed_Raid", 00:18:11.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.571 "strip_size_kb": 0, 00:18:11.571 "state": "configuring", 00:18:11.571 "raid_level": "raid1", 00:18:11.571 "superblock": false, 00:18:11.571 "num_base_bdevs": 4, 00:18:11.571 "num_base_bdevs_discovered": 2, 00:18:11.571 "num_base_bdevs_operational": 4, 00:18:11.571 "base_bdevs_list": [ 00:18:11.571 { 00:18:11.571 "name": "BaseBdev1", 00:18:11.571 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:11.571 
"is_configured": true, 00:18:11.571 "data_offset": 0, 00:18:11.571 "data_size": 65536 00:18:11.571 }, 00:18:11.571 { 00:18:11.571 "name": null, 00:18:11.571 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:11.571 "is_configured": false, 00:18:11.571 "data_offset": 0, 00:18:11.571 "data_size": 65536 00:18:11.571 }, 00:18:11.571 { 00:18:11.571 "name": null, 00:18:11.571 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:11.571 "is_configured": false, 00:18:11.571 "data_offset": 0, 00:18:11.571 "data_size": 65536 00:18:11.571 }, 00:18:11.571 { 00:18:11.571 "name": "BaseBdev4", 00:18:11.571 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:11.571 "is_configured": true, 00:18:11.571 "data_offset": 0, 00:18:11.571 "data_size": 65536 00:18:11.571 } 00:18:11.571 ] 00:18:11.571 }' 00:18:11.571 13:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.571 13:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.142 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.142 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:12.402 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:12.402 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:12.402 [2024-06-10 13:46:26.873557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.662 13:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.662 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.662 "name": "Existed_Raid", 00:18:12.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.662 "strip_size_kb": 0, 00:18:12.662 "state": "configuring", 00:18:12.662 "raid_level": "raid1", 00:18:12.662 "superblock": false, 00:18:12.662 "num_base_bdevs": 4, 00:18:12.662 "num_base_bdevs_discovered": 3, 00:18:12.662 "num_base_bdevs_operational": 4, 00:18:12.662 "base_bdevs_list": [ 00:18:12.662 { 00:18:12.662 "name": "BaseBdev1", 00:18:12.662 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:12.662 "is_configured": true, 00:18:12.662 "data_offset": 0, 00:18:12.662 "data_size": 65536 
00:18:12.662 }, 00:18:12.662 { 00:18:12.662 "name": null, 00:18:12.662 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:12.662 "is_configured": false, 00:18:12.662 "data_offset": 0, 00:18:12.662 "data_size": 65536 00:18:12.662 }, 00:18:12.662 { 00:18:12.662 "name": "BaseBdev3", 00:18:12.662 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:12.662 "is_configured": true, 00:18:12.662 "data_offset": 0, 00:18:12.662 "data_size": 65536 00:18:12.662 }, 00:18:12.662 { 00:18:12.662 "name": "BaseBdev4", 00:18:12.662 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:12.662 "is_configured": true, 00:18:12.662 "data_offset": 0, 00:18:12.662 "data_size": 65536 00:18:12.662 } 00:18:12.662 ] 00:18:12.662 }' 00:18:12.662 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.662 13:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.232 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.232 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:13.492 [2024-06-10 13:46:27.948303] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:13.492 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.753 13:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.753 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.753 "name": "Existed_Raid", 00:18:13.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.753 "strip_size_kb": 0, 00:18:13.753 "state": "configuring", 00:18:13.753 "raid_level": "raid1", 00:18:13.753 "superblock": false, 00:18:13.753 "num_base_bdevs": 4, 00:18:13.753 "num_base_bdevs_discovered": 2, 00:18:13.753 "num_base_bdevs_operational": 4, 00:18:13.753 "base_bdevs_list": [ 00:18:13.753 { 00:18:13.753 "name": null, 00:18:13.753 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:13.753 "is_configured": false, 00:18:13.753 "data_offset": 0, 00:18:13.753 "data_size": 65536 00:18:13.753 }, 00:18:13.753 { 00:18:13.753 "name": null, 00:18:13.753 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 
00:18:13.753 "is_configured": false, 00:18:13.753 "data_offset": 0, 00:18:13.753 "data_size": 65536 00:18:13.753 }, 00:18:13.753 { 00:18:13.753 "name": "BaseBdev3", 00:18:13.753 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:13.753 "is_configured": true, 00:18:13.753 "data_offset": 0, 00:18:13.753 "data_size": 65536 00:18:13.753 }, 00:18:13.753 { 00:18:13.753 "name": "BaseBdev4", 00:18:13.753 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:13.753 "is_configured": true, 00:18:13.753 "data_offset": 0, 00:18:13.753 "data_size": 65536 00:18:13.753 } 00:18:13.753 ] 00:18:13.753 }' 00:18:13.753 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.753 13:46:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.323 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.323 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:14.582 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:14.582 13:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:14.842 [2024-06-10 13:46:29.101158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.842 13:46:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.842 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.103 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.103 "name": "Existed_Raid", 00:18:15.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.103 "strip_size_kb": 0, 00:18:15.103 "state": "configuring", 00:18:15.103 "raid_level": "raid1", 00:18:15.103 "superblock": false, 00:18:15.103 "num_base_bdevs": 4, 00:18:15.103 "num_base_bdevs_discovered": 3, 00:18:15.103 "num_base_bdevs_operational": 4, 00:18:15.103 "base_bdevs_list": [ 00:18:15.103 { 00:18:15.103 "name": null, 00:18:15.103 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:15.103 "is_configured": false, 00:18:15.103 "data_offset": 0, 00:18:15.103 "data_size": 65536 00:18:15.103 }, 00:18:15.103 { 00:18:15.103 "name": "BaseBdev2", 00:18:15.103 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:15.103 "is_configured": true, 00:18:15.103 "data_offset": 0, 00:18:15.103 
"data_size": 65536 00:18:15.103 }, 00:18:15.103 { 00:18:15.103 "name": "BaseBdev3", 00:18:15.103 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:15.103 "is_configured": true, 00:18:15.103 "data_offset": 0, 00:18:15.103 "data_size": 65536 00:18:15.103 }, 00:18:15.103 { 00:18:15.103 "name": "BaseBdev4", 00:18:15.103 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:15.103 "is_configured": true, 00:18:15.103 "data_offset": 0, 00:18:15.103 "data_size": 65536 00:18:15.103 } 00:18:15.103 ] 00:18:15.103 }' 00:18:15.103 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.103 13:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.673 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.673 13:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:15.673 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:15.673 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.673 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:15.933 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 46cd3bff-a0ef-4eb6-9469-865b7994cc61 00:18:16.197 [2024-06-10 13:46:30.466094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:16.197 [2024-06-10 13:46:30.466117] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c117c0 
00:18:16.197 [2024-06-10 13:46:30.466122] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:16.197 [2024-06-10 13:46:30.466288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c143b0 00:18:16.197 [2024-06-10 13:46:30.466392] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c117c0 00:18:16.197 [2024-06-10 13:46:30.466398] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c117c0 00:18:16.197 [2024-06-10 13:46:30.466526] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:16.197 NewBaseBdev 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:16.197 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:16.524 [ 00:18:16.524 { 00:18:16.524 "name": "NewBaseBdev", 00:18:16.524 "aliases": [ 00:18:16.524 "46cd3bff-a0ef-4eb6-9469-865b7994cc61" 00:18:16.524 ], 00:18:16.524 "product_name": "Malloc disk", 00:18:16.524 "block_size": 512, 
00:18:16.524 "num_blocks": 65536, 00:18:16.524 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:16.524 "assigned_rate_limits": { 00:18:16.524 "rw_ios_per_sec": 0, 00:18:16.524 "rw_mbytes_per_sec": 0, 00:18:16.524 "r_mbytes_per_sec": 0, 00:18:16.524 "w_mbytes_per_sec": 0 00:18:16.524 }, 00:18:16.524 "claimed": true, 00:18:16.524 "claim_type": "exclusive_write", 00:18:16.524 "zoned": false, 00:18:16.524 "supported_io_types": { 00:18:16.524 "read": true, 00:18:16.524 "write": true, 00:18:16.524 "unmap": true, 00:18:16.524 "write_zeroes": true, 00:18:16.524 "flush": true, 00:18:16.524 "reset": true, 00:18:16.524 "compare": false, 00:18:16.524 "compare_and_write": false, 00:18:16.524 "abort": true, 00:18:16.524 "nvme_admin": false, 00:18:16.524 "nvme_io": false 00:18:16.524 }, 00:18:16.524 "memory_domains": [ 00:18:16.524 { 00:18:16.524 "dma_device_id": "system", 00:18:16.524 "dma_device_type": 1 00:18:16.524 }, 00:18:16.524 { 00:18:16.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.524 "dma_device_type": 2 00:18:16.524 } 00:18:16.524 ], 00:18:16.524 "driver_specific": {} 00:18:16.524 } 00:18:16.524 ] 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.524 13:46:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.524 13:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.793 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.793 "name": "Existed_Raid", 00:18:16.793 "uuid": "269fd102-966f-43c6-96e9-6642095b45a1", 00:18:16.793 "strip_size_kb": 0, 00:18:16.793 "state": "online", 00:18:16.793 "raid_level": "raid1", 00:18:16.793 "superblock": false, 00:18:16.793 "num_base_bdevs": 4, 00:18:16.793 "num_base_bdevs_discovered": 4, 00:18:16.793 "num_base_bdevs_operational": 4, 00:18:16.793 "base_bdevs_list": [ 00:18:16.793 { 00:18:16.793 "name": "NewBaseBdev", 00:18:16.793 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:16.793 "is_configured": true, 00:18:16.793 "data_offset": 0, 00:18:16.793 "data_size": 65536 00:18:16.793 }, 00:18:16.793 { 00:18:16.793 "name": "BaseBdev2", 00:18:16.793 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:16.793 "is_configured": true, 00:18:16.793 "data_offset": 0, 00:18:16.793 "data_size": 65536 00:18:16.793 }, 00:18:16.793 { 00:18:16.793 "name": "BaseBdev3", 00:18:16.793 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:16.793 "is_configured": true, 00:18:16.793 "data_offset": 0, 00:18:16.793 "data_size": 65536 00:18:16.793 }, 00:18:16.793 { 00:18:16.793 "name": "BaseBdev4", 00:18:16.793 "uuid": 
"cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:16.793 "is_configured": true, 00:18:16.793 "data_offset": 0, 00:18:16.793 "data_size": 65536 00:18:16.793 } 00:18:16.793 ] 00:18:16.793 }' 00:18:16.793 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.793 13:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:17.364 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:17.623 [2024-06-10 13:46:31.849866] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:17.623 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:17.623 "name": "Existed_Raid", 00:18:17.623 "aliases": [ 00:18:17.623 "269fd102-966f-43c6-96e9-6642095b45a1" 00:18:17.623 ], 00:18:17.623 "product_name": "Raid Volume", 00:18:17.623 "block_size": 512, 00:18:17.623 "num_blocks": 65536, 00:18:17.623 "uuid": "269fd102-966f-43c6-96e9-6642095b45a1", 00:18:17.623 "assigned_rate_limits": { 00:18:17.623 "rw_ios_per_sec": 0, 00:18:17.623 "rw_mbytes_per_sec": 
0, 00:18:17.623 "r_mbytes_per_sec": 0, 00:18:17.623 "w_mbytes_per_sec": 0 00:18:17.623 }, 00:18:17.623 "claimed": false, 00:18:17.623 "zoned": false, 00:18:17.623 "supported_io_types": { 00:18:17.623 "read": true, 00:18:17.623 "write": true, 00:18:17.623 "unmap": false, 00:18:17.623 "write_zeroes": true, 00:18:17.623 "flush": false, 00:18:17.624 "reset": true, 00:18:17.624 "compare": false, 00:18:17.624 "compare_and_write": false, 00:18:17.624 "abort": false, 00:18:17.624 "nvme_admin": false, 00:18:17.624 "nvme_io": false 00:18:17.624 }, 00:18:17.624 "memory_domains": [ 00:18:17.624 { 00:18:17.624 "dma_device_id": "system", 00:18:17.624 "dma_device_type": 1 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.624 "dma_device_type": 2 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "system", 00:18:17.624 "dma_device_type": 1 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.624 "dma_device_type": 2 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "system", 00:18:17.624 "dma_device_type": 1 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.624 "dma_device_type": 2 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "system", 00:18:17.624 "dma_device_type": 1 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.624 "dma_device_type": 2 00:18:17.624 } 00:18:17.624 ], 00:18:17.624 "driver_specific": { 00:18:17.624 "raid": { 00:18:17.624 "uuid": "269fd102-966f-43c6-96e9-6642095b45a1", 00:18:17.624 "strip_size_kb": 0, 00:18:17.624 "state": "online", 00:18:17.624 "raid_level": "raid1", 00:18:17.624 "superblock": false, 00:18:17.624 "num_base_bdevs": 4, 00:18:17.624 "num_base_bdevs_discovered": 4, 00:18:17.624 "num_base_bdevs_operational": 4, 00:18:17.624 "base_bdevs_list": [ 00:18:17.624 { 00:18:17.624 "name": "NewBaseBdev", 00:18:17.624 "uuid": 
"46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:17.624 "is_configured": true, 00:18:17.624 "data_offset": 0, 00:18:17.624 "data_size": 65536 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "name": "BaseBdev2", 00:18:17.624 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:17.624 "is_configured": true, 00:18:17.624 "data_offset": 0, 00:18:17.624 "data_size": 65536 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "name": "BaseBdev3", 00:18:17.624 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:17.624 "is_configured": true, 00:18:17.624 "data_offset": 0, 00:18:17.624 "data_size": 65536 00:18:17.624 }, 00:18:17.624 { 00:18:17.624 "name": "BaseBdev4", 00:18:17.624 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:17.624 "is_configured": true, 00:18:17.624 "data_offset": 0, 00:18:17.624 "data_size": 65536 00:18:17.624 } 00:18:17.624 ] 00:18:17.624 } 00:18:17.624 } 00:18:17.624 }' 00:18:17.624 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:17.624 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:17.624 BaseBdev2 00:18:17.624 BaseBdev3 00:18:17.624 BaseBdev4' 00:18:17.624 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.624 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:17.624 13:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.883 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.883 "name": "NewBaseBdev", 00:18:17.883 "aliases": [ 00:18:17.883 "46cd3bff-a0ef-4eb6-9469-865b7994cc61" 00:18:17.883 ], 00:18:17.883 "product_name": "Malloc disk", 00:18:17.883 "block_size": 512, 00:18:17.883 
"num_blocks": 65536, 00:18:17.884 "uuid": "46cd3bff-a0ef-4eb6-9469-865b7994cc61", 00:18:17.884 "assigned_rate_limits": { 00:18:17.884 "rw_ios_per_sec": 0, 00:18:17.884 "rw_mbytes_per_sec": 0, 00:18:17.884 "r_mbytes_per_sec": 0, 00:18:17.884 "w_mbytes_per_sec": 0 00:18:17.884 }, 00:18:17.884 "claimed": true, 00:18:17.884 "claim_type": "exclusive_write", 00:18:17.884 "zoned": false, 00:18:17.884 "supported_io_types": { 00:18:17.884 "read": true, 00:18:17.884 "write": true, 00:18:17.884 "unmap": true, 00:18:17.884 "write_zeroes": true, 00:18:17.884 "flush": true, 00:18:17.884 "reset": true, 00:18:17.884 "compare": false, 00:18:17.884 "compare_and_write": false, 00:18:17.884 "abort": true, 00:18:17.884 "nvme_admin": false, 00:18:17.884 "nvme_io": false 00:18:17.884 }, 00:18:17.884 "memory_domains": [ 00:18:17.884 { 00:18:17.884 "dma_device_id": "system", 00:18:17.884 "dma_device_type": 1 00:18:17.884 }, 00:18:17.884 { 00:18:17.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.884 "dma_device_type": 2 00:18:17.884 } 00:18:17.884 ], 00:18:17.884 "driver_specific": {} 00:18:17.884 }' 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.884 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:18.144 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.404 "name": "BaseBdev2", 00:18:18.404 "aliases": [ 00:18:18.404 "e8b0449c-a4c0-4418-b4b9-dc17d94f5356" 00:18:18.404 ], 00:18:18.404 "product_name": "Malloc disk", 00:18:18.404 "block_size": 512, 00:18:18.404 "num_blocks": 65536, 00:18:18.404 "uuid": "e8b0449c-a4c0-4418-b4b9-dc17d94f5356", 00:18:18.404 "assigned_rate_limits": { 00:18:18.404 "rw_ios_per_sec": 0, 00:18:18.404 "rw_mbytes_per_sec": 0, 00:18:18.404 "r_mbytes_per_sec": 0, 00:18:18.404 "w_mbytes_per_sec": 0 00:18:18.404 }, 00:18:18.404 "claimed": true, 00:18:18.404 "claim_type": "exclusive_write", 00:18:18.404 "zoned": false, 00:18:18.404 "supported_io_types": { 00:18:18.404 "read": true, 00:18:18.404 "write": true, 00:18:18.404 "unmap": true, 00:18:18.404 "write_zeroes": true, 00:18:18.404 "flush": true, 00:18:18.404 "reset": true, 00:18:18.404 "compare": false, 00:18:18.404 "compare_and_write": false, 00:18:18.404 "abort": true, 00:18:18.404 "nvme_admin": false, 00:18:18.404 "nvme_io": false 00:18:18.404 }, 00:18:18.404 "memory_domains": [ 00:18:18.404 { 00:18:18.404 "dma_device_id": "system", 00:18:18.404 "dma_device_type": 1 
00:18:18.404 }, 00:18:18.404 { 00:18:18.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.404 "dma_device_type": 2 00:18:18.404 } 00:18:18.404 ], 00:18:18.404 "driver_specific": {} 00:18:18.404 }' 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.404 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:18.665 13:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.925 "name": "BaseBdev3", 
00:18:18.925 "aliases": [ 00:18:18.925 "10774db7-c457-4104-a174-e4a5857c1e34" 00:18:18.925 ], 00:18:18.925 "product_name": "Malloc disk", 00:18:18.925 "block_size": 512, 00:18:18.925 "num_blocks": 65536, 00:18:18.925 "uuid": "10774db7-c457-4104-a174-e4a5857c1e34", 00:18:18.925 "assigned_rate_limits": { 00:18:18.925 "rw_ios_per_sec": 0, 00:18:18.925 "rw_mbytes_per_sec": 0, 00:18:18.925 "r_mbytes_per_sec": 0, 00:18:18.925 "w_mbytes_per_sec": 0 00:18:18.925 }, 00:18:18.925 "claimed": true, 00:18:18.925 "claim_type": "exclusive_write", 00:18:18.925 "zoned": false, 00:18:18.925 "supported_io_types": { 00:18:18.925 "read": true, 00:18:18.925 "write": true, 00:18:18.925 "unmap": true, 00:18:18.925 "write_zeroes": true, 00:18:18.925 "flush": true, 00:18:18.925 "reset": true, 00:18:18.925 "compare": false, 00:18:18.925 "compare_and_write": false, 00:18:18.925 "abort": true, 00:18:18.925 "nvme_admin": false, 00:18:18.925 "nvme_io": false 00:18:18.925 }, 00:18:18.925 "memory_domains": [ 00:18:18.925 { 00:18:18.925 "dma_device_id": "system", 00:18:18.925 "dma_device_type": 1 00:18:18.925 }, 00:18:18.925 { 00:18:18.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.925 "dma_device_type": 2 00:18:18.925 } 00:18:18.925 ], 00:18:18.925 "driver_specific": {} 00:18:18.925 }' 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:18:18.925 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:19.185 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.445 "name": "BaseBdev4", 00:18:19.445 "aliases": [ 00:18:19.445 "cee39ba2-1a50-4a74-ba14-ad56023adcf7" 00:18:19.445 ], 00:18:19.445 "product_name": "Malloc disk", 00:18:19.445 "block_size": 512, 00:18:19.445 "num_blocks": 65536, 00:18:19.445 "uuid": "cee39ba2-1a50-4a74-ba14-ad56023adcf7", 00:18:19.445 "assigned_rate_limits": { 00:18:19.445 "rw_ios_per_sec": 0, 00:18:19.445 "rw_mbytes_per_sec": 0, 00:18:19.445 "r_mbytes_per_sec": 0, 00:18:19.445 "w_mbytes_per_sec": 0 00:18:19.445 }, 00:18:19.445 "claimed": true, 00:18:19.445 "claim_type": "exclusive_write", 00:18:19.445 "zoned": false, 00:18:19.445 "supported_io_types": { 00:18:19.445 "read": true, 00:18:19.445 "write": true, 00:18:19.445 "unmap": true, 00:18:19.445 "write_zeroes": true, 00:18:19.445 "flush": true, 00:18:19.445 "reset": true, 00:18:19.445 "compare": false, 00:18:19.445 "compare_and_write": false, 00:18:19.445 "abort": true, 00:18:19.445 "nvme_admin": false, 00:18:19.445 
"nvme_io": false 00:18:19.445 }, 00:18:19.445 "memory_domains": [ 00:18:19.445 { 00:18:19.445 "dma_device_id": "system", 00:18:19.445 "dma_device_type": 1 00:18:19.445 }, 00:18:19.445 { 00:18:19.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.445 "dma_device_type": 2 00:18:19.445 } 00:18:19.445 ], 00:18:19.445 "driver_specific": {} 00:18:19.445 }' 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.445 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.706 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.706 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.706 13:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.706 13:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.706 13:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.706 13:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:19.966 [2024-06-10 13:46:34.255736] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:19.966 [2024-06-10 13:46:34.255752] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:18:19.966 [2024-06-10 13:46:34.255791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:19.966 [2024-06-10 13:46:34.256017] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:19.966 [2024-06-10 13:46:34.256025] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c117c0 name Existed_Raid, state offline 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1600152 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 1600152 ']' 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 1600152 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1600152 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1600152' 00:18:19.966 killing process with pid 1600152 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 1600152 00:18:19.966 [2024-06-10 13:46:34.323752] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:19.966 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 1600152 00:18:19.966 [2024-06-10 13:46:34.345054] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 
00:18:20.227 13:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:20.227 00:18:20.227 real 0m28.173s 00:18:20.227 user 0m52.856s 00:18:20.227 sys 0m4.095s 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.227 ************************************ 00:18:20.227 END TEST raid_state_function_test 00:18:20.227 ************************************ 00:18:20.227 13:46:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:18:20.227 13:46:34 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:18:20.227 13:46:34 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:20.227 13:46:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:20.227 ************************************ 00:18:20.227 START TEST raid_state_function_test_sb 00:18:20.227 ************************************ 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 true 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 
00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' 
raid1 '!=' raid1 ']' 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1606210 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1606210' 00:18:20.227 Process raid pid: 1606210 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1606210 /var/tmp/spdk-raid.sock 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1606210 ']' 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:20.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:20.227 13:46:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.227 [2024-06-10 13:46:34.620112] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:18:20.227 [2024-06-10 13:46:34.620169] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:20.487 [2024-06-10 13:46:34.712406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:20.487 [2024-06-10 13:46:34.780207] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.487 [2024-06-10 13:46:34.828323] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:20.487 [2024-06-10 13:46:34.828346] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:21.058 13:46:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:21.058 13:46:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:18:21.058 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:21.319 [2024-06-10 13:46:35.656309] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:21.319 [2024-06-10 13:46:35.656338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:21.319 [2024-06-10 13:46:35.656344] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:21.319 [2024-06-10 13:46:35.656350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:21.319 [2024-06-10 13:46:35.656356] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:21.319 [2024-06-10 13:46:35.656361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:21.319 [2024-06-10 
13:46:35.656366] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:21.319 [2024-06-10 13:46:35.656372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.319 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.579 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.579 "name": "Existed_Raid", 00:18:21.579 "uuid": "e7d6159e-2956-4284-918a-2208e863bb49", 00:18:21.579 "strip_size_kb": 
0, 00:18:21.579 "state": "configuring", 00:18:21.579 "raid_level": "raid1", 00:18:21.579 "superblock": true, 00:18:21.579 "num_base_bdevs": 4, 00:18:21.579 "num_base_bdevs_discovered": 0, 00:18:21.579 "num_base_bdevs_operational": 4, 00:18:21.579 "base_bdevs_list": [ 00:18:21.579 { 00:18:21.579 "name": "BaseBdev1", 00:18:21.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.579 "is_configured": false, 00:18:21.579 "data_offset": 0, 00:18:21.579 "data_size": 0 00:18:21.579 }, 00:18:21.579 { 00:18:21.579 "name": "BaseBdev2", 00:18:21.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.579 "is_configured": false, 00:18:21.579 "data_offset": 0, 00:18:21.579 "data_size": 0 00:18:21.580 }, 00:18:21.580 { 00:18:21.580 "name": "BaseBdev3", 00:18:21.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.580 "is_configured": false, 00:18:21.580 "data_offset": 0, 00:18:21.580 "data_size": 0 00:18:21.580 }, 00:18:21.580 { 00:18:21.580 "name": "BaseBdev4", 00:18:21.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.580 "is_configured": false, 00:18:21.580 "data_offset": 0, 00:18:21.580 "data_size": 0 00:18:21.580 } 00:18:21.580 ] 00:18:21.580 }' 00:18:21.580 13:46:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.580 13:46:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:22.150 13:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:22.150 [2024-06-10 13:46:36.606596] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:22.150 [2024-06-10 13:46:36.606614] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2086760 name Existed_Raid, state configuring 00:18:22.151 13:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:22.411 [2024-06-10 13:46:36.807132] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:22.411 [2024-06-10 13:46:36.807150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:22.411 [2024-06-10 13:46:36.807155] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:22.411 [2024-06-10 13:46:36.807166] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:22.411 [2024-06-10 13:46:36.807171] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:22.411 [2024-06-10 13:46:36.807177] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:22.411 [2024-06-10 13:46:36.807182] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:22.411 [2024-06-10 13:46:36.807188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:22.412 13:46:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:22.672 [2024-06-10 13:46:37.014527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.672 BaseBdev1 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:22.672 13:46:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:22.672 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:22.931 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:23.191 [ 00:18:23.191 { 00:18:23.191 "name": "BaseBdev1", 00:18:23.191 "aliases": [ 00:18:23.191 "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c" 00:18:23.191 ], 00:18:23.191 "product_name": "Malloc disk", 00:18:23.191 "block_size": 512, 00:18:23.191 "num_blocks": 65536, 00:18:23.191 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:23.191 "assigned_rate_limits": { 00:18:23.191 "rw_ios_per_sec": 0, 00:18:23.191 "rw_mbytes_per_sec": 0, 00:18:23.191 "r_mbytes_per_sec": 0, 00:18:23.191 "w_mbytes_per_sec": 0 00:18:23.191 }, 00:18:23.191 "claimed": true, 00:18:23.191 "claim_type": "exclusive_write", 00:18:23.191 "zoned": false, 00:18:23.191 "supported_io_types": { 00:18:23.191 "read": true, 00:18:23.191 "write": true, 00:18:23.191 "unmap": true, 00:18:23.191 "write_zeroes": true, 00:18:23.191 "flush": true, 00:18:23.191 "reset": true, 00:18:23.191 "compare": false, 00:18:23.191 "compare_and_write": false, 00:18:23.191 "abort": true, 00:18:23.191 "nvme_admin": false, 00:18:23.191 "nvme_io": false 00:18:23.191 }, 00:18:23.191 "memory_domains": [ 00:18:23.191 { 00:18:23.191 "dma_device_id": "system", 00:18:23.191 "dma_device_type": 1 00:18:23.191 }, 00:18:23.191 { 00:18:23.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.191 
"dma_device_type": 2 00:18:23.191 } 00:18:23.191 ], 00:18:23.191 "driver_specific": {} 00:18:23.191 } 00:18:23.191 ] 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.191 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.192 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.192 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:23.192 "name": "Existed_Raid", 00:18:23.192 "uuid": "9bb8aa27-ed3f-461b-b7c1-99a57c5ad695", 00:18:23.192 "strip_size_kb": 0, 
00:18:23.192 "state": "configuring", 00:18:23.192 "raid_level": "raid1", 00:18:23.192 "superblock": true, 00:18:23.192 "num_base_bdevs": 4, 00:18:23.192 "num_base_bdevs_discovered": 1, 00:18:23.192 "num_base_bdevs_operational": 4, 00:18:23.192 "base_bdevs_list": [ 00:18:23.192 { 00:18:23.192 "name": "BaseBdev1", 00:18:23.192 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:23.192 "is_configured": true, 00:18:23.192 "data_offset": 2048, 00:18:23.192 "data_size": 63488 00:18:23.192 }, 00:18:23.192 { 00:18:23.192 "name": "BaseBdev2", 00:18:23.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.192 "is_configured": false, 00:18:23.192 "data_offset": 0, 00:18:23.192 "data_size": 0 00:18:23.192 }, 00:18:23.192 { 00:18:23.192 "name": "BaseBdev3", 00:18:23.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.192 "is_configured": false, 00:18:23.192 "data_offset": 0, 00:18:23.192 "data_size": 0 00:18:23.192 }, 00:18:23.192 { 00:18:23.192 "name": "BaseBdev4", 00:18:23.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.192 "is_configured": false, 00:18:23.192 "data_offset": 0, 00:18:23.192 "data_size": 0 00:18:23.192 } 00:18:23.192 ] 00:18:23.192 }' 00:18:23.192 13:46:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.192 13:46:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.761 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:24.020 [2024-06-10 13:46:38.341900] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:24.020 [2024-06-10 13:46:38.341931] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2085fd0 name Existed_Raid, state configuring 00:18:24.020 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:24.281 [2024-06-10 13:46:38.546457] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:24.281 [2024-06-10 13:46:38.547676] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:24.281 [2024-06-10 13:46:38.547701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:24.281 [2024-06-10 13:46:38.547707] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:24.281 [2024-06-10 13:46:38.547713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:24.281 [2024-06-10 13:46:38.547718] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:24.281 [2024-06-10 13:46:38.547724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.281 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.541 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.541 "name": "Existed_Raid", 00:18:24.541 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:24.541 "strip_size_kb": 0, 00:18:24.541 "state": "configuring", 00:18:24.541 "raid_level": "raid1", 00:18:24.541 "superblock": true, 00:18:24.541 "num_base_bdevs": 4, 00:18:24.541 "num_base_bdevs_discovered": 1, 00:18:24.541 "num_base_bdevs_operational": 4, 00:18:24.541 "base_bdevs_list": [ 00:18:24.541 { 00:18:24.541 "name": "BaseBdev1", 00:18:24.541 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:24.541 "is_configured": true, 00:18:24.541 "data_offset": 2048, 00:18:24.541 "data_size": 63488 00:18:24.541 }, 00:18:24.541 { 00:18:24.541 "name": "BaseBdev2", 00:18:24.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.541 "is_configured": false, 00:18:24.541 "data_offset": 0, 00:18:24.541 "data_size": 0 00:18:24.541 }, 00:18:24.541 { 00:18:24.541 "name": "BaseBdev3", 00:18:24.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.541 "is_configured": false, 00:18:24.541 "data_offset": 0, 00:18:24.541 
"data_size": 0 00:18:24.541 }, 00:18:24.541 { 00:18:24.541 "name": "BaseBdev4", 00:18:24.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.541 "is_configured": false, 00:18:24.541 "data_offset": 0, 00:18:24.541 "data_size": 0 00:18:24.541 } 00:18:24.541 ] 00:18:24.541 }' 00:18:24.541 13:46:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.541 13:46:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:25.111 [2024-06-10 13:46:39.509962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:25.111 BaseBdev2 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:25.111 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.371 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:25.631 [ 
00:18:25.631 { 00:18:25.631 "name": "BaseBdev2", 00:18:25.631 "aliases": [ 00:18:25.631 "66285652-3f8c-4f44-9a63-db66a5860efc" 00:18:25.631 ], 00:18:25.631 "product_name": "Malloc disk", 00:18:25.631 "block_size": 512, 00:18:25.631 "num_blocks": 65536, 00:18:25.631 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:25.631 "assigned_rate_limits": { 00:18:25.631 "rw_ios_per_sec": 0, 00:18:25.631 "rw_mbytes_per_sec": 0, 00:18:25.631 "r_mbytes_per_sec": 0, 00:18:25.631 "w_mbytes_per_sec": 0 00:18:25.631 }, 00:18:25.631 "claimed": true, 00:18:25.631 "claim_type": "exclusive_write", 00:18:25.631 "zoned": false, 00:18:25.631 "supported_io_types": { 00:18:25.631 "read": true, 00:18:25.631 "write": true, 00:18:25.631 "unmap": true, 00:18:25.631 "write_zeroes": true, 00:18:25.631 "flush": true, 00:18:25.631 "reset": true, 00:18:25.631 "compare": false, 00:18:25.631 "compare_and_write": false, 00:18:25.631 "abort": true, 00:18:25.631 "nvme_admin": false, 00:18:25.631 "nvme_io": false 00:18:25.631 }, 00:18:25.631 "memory_domains": [ 00:18:25.631 { 00:18:25.631 "dma_device_id": "system", 00:18:25.631 "dma_device_type": 1 00:18:25.631 }, 00:18:25.631 { 00:18:25.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.631 "dma_device_type": 2 00:18:25.631 } 00:18:25.631 ], 00:18:25.631 "driver_specific": {} 00:18:25.631 } 00:18:25.631 ] 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.631 13:46:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.891 13:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.891 "name": "Existed_Raid", 00:18:25.891 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:25.891 "strip_size_kb": 0, 00:18:25.891 "state": "configuring", 00:18:25.891 "raid_level": "raid1", 00:18:25.891 "superblock": true, 00:18:25.891 "num_base_bdevs": 4, 00:18:25.891 "num_base_bdevs_discovered": 2, 00:18:25.891 "num_base_bdevs_operational": 4, 00:18:25.891 "base_bdevs_list": [ 00:18:25.891 { 00:18:25.891 "name": "BaseBdev1", 00:18:25.891 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:25.891 "is_configured": true, 00:18:25.891 "data_offset": 2048, 00:18:25.891 "data_size": 63488 00:18:25.891 }, 00:18:25.891 { 00:18:25.891 "name": "BaseBdev2", 00:18:25.891 "uuid": 
"66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:25.891 "is_configured": true, 00:18:25.891 "data_offset": 2048, 00:18:25.891 "data_size": 63488 00:18:25.891 }, 00:18:25.891 { 00:18:25.891 "name": "BaseBdev3", 00:18:25.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.891 "is_configured": false, 00:18:25.891 "data_offset": 0, 00:18:25.891 "data_size": 0 00:18:25.891 }, 00:18:25.891 { 00:18:25.891 "name": "BaseBdev4", 00:18:25.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.891 "is_configured": false, 00:18:25.891 "data_offset": 0, 00:18:25.891 "data_size": 0 00:18:25.891 } 00:18:25.891 ] 00:18:25.891 }' 00:18:25.891 13:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.891 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:26.462 [2024-06-10 13:46:40.854475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:26.462 BaseBdev3 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:26.462 13:46:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.722 13:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:26.983 [ 00:18:26.983 { 00:18:26.983 "name": "BaseBdev3", 00:18:26.983 "aliases": [ 00:18:26.983 "69406e98-11cd-4217-b99b-3809b7b61734" 00:18:26.983 ], 00:18:26.983 "product_name": "Malloc disk", 00:18:26.983 "block_size": 512, 00:18:26.983 "num_blocks": 65536, 00:18:26.983 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:26.983 "assigned_rate_limits": { 00:18:26.983 "rw_ios_per_sec": 0, 00:18:26.983 "rw_mbytes_per_sec": 0, 00:18:26.983 "r_mbytes_per_sec": 0, 00:18:26.983 "w_mbytes_per_sec": 0 00:18:26.983 }, 00:18:26.983 "claimed": true, 00:18:26.983 "claim_type": "exclusive_write", 00:18:26.983 "zoned": false, 00:18:26.983 "supported_io_types": { 00:18:26.983 "read": true, 00:18:26.983 "write": true, 00:18:26.983 "unmap": true, 00:18:26.983 "write_zeroes": true, 00:18:26.983 "flush": true, 00:18:26.983 "reset": true, 00:18:26.983 "compare": false, 00:18:26.983 "compare_and_write": false, 00:18:26.983 "abort": true, 00:18:26.983 "nvme_admin": false, 00:18:26.983 "nvme_io": false 00:18:26.983 }, 00:18:26.983 "memory_domains": [ 00:18:26.983 { 00:18:26.983 "dma_device_id": "system", 00:18:26.983 "dma_device_type": 1 00:18:26.983 }, 00:18:26.983 { 00:18:26.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.983 "dma_device_type": 2 00:18:26.983 } 00:18:26.983 ], 00:18:26.983 "driver_specific": {} 00:18:26.983 } 00:18:26.983 ] 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.983 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.244 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.244 "name": "Existed_Raid", 00:18:27.244 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:27.244 "strip_size_kb": 0, 00:18:27.244 "state": "configuring", 00:18:27.244 "raid_level": "raid1", 00:18:27.244 "superblock": true, 00:18:27.244 "num_base_bdevs": 4, 00:18:27.244 "num_base_bdevs_discovered": 3, 00:18:27.244 
"num_base_bdevs_operational": 4, 00:18:27.244 "base_bdevs_list": [ 00:18:27.244 { 00:18:27.244 "name": "BaseBdev1", 00:18:27.244 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:27.244 "is_configured": true, 00:18:27.244 "data_offset": 2048, 00:18:27.244 "data_size": 63488 00:18:27.244 }, 00:18:27.244 { 00:18:27.244 "name": "BaseBdev2", 00:18:27.244 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:27.244 "is_configured": true, 00:18:27.244 "data_offset": 2048, 00:18:27.244 "data_size": 63488 00:18:27.244 }, 00:18:27.244 { 00:18:27.244 "name": "BaseBdev3", 00:18:27.244 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:27.244 "is_configured": true, 00:18:27.244 "data_offset": 2048, 00:18:27.244 "data_size": 63488 00:18:27.244 }, 00:18:27.244 { 00:18:27.244 "name": "BaseBdev4", 00:18:27.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.244 "is_configured": false, 00:18:27.244 "data_offset": 0, 00:18:27.244 "data_size": 0 00:18:27.244 } 00:18:27.244 ] 00:18:27.244 }' 00:18:27.244 13:46:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.244 13:46:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:27.814 [2024-06-10 13:46:42.211032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:27.814 [2024-06-10 13:46:42.211158] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2087030 00:18:27.814 [2024-06-10 13:46:42.211172] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:27.814 [2024-06-10 13:46:42.211317] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223a250 00:18:27.814 [2024-06-10 13:46:42.211420] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2087030 00:18:27.814 [2024-06-10 13:46:42.211426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2087030 00:18:27.814 [2024-06-10 13:46:42.211498] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.814 BaseBdev4 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:27.814 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.075 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:28.335 [ 00:18:28.335 { 00:18:28.335 "name": "BaseBdev4", 00:18:28.335 "aliases": [ 00:18:28.335 "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7" 00:18:28.335 ], 00:18:28.335 "product_name": "Malloc disk", 00:18:28.335 "block_size": 512, 00:18:28.335 "num_blocks": 65536, 00:18:28.335 "uuid": "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7", 00:18:28.335 "assigned_rate_limits": { 00:18:28.335 "rw_ios_per_sec": 0, 00:18:28.335 "rw_mbytes_per_sec": 0, 00:18:28.335 "r_mbytes_per_sec": 0, 00:18:28.335 
"w_mbytes_per_sec": 0 00:18:28.335 }, 00:18:28.335 "claimed": true, 00:18:28.335 "claim_type": "exclusive_write", 00:18:28.335 "zoned": false, 00:18:28.335 "supported_io_types": { 00:18:28.335 "read": true, 00:18:28.335 "write": true, 00:18:28.335 "unmap": true, 00:18:28.335 "write_zeroes": true, 00:18:28.335 "flush": true, 00:18:28.335 "reset": true, 00:18:28.335 "compare": false, 00:18:28.335 "compare_and_write": false, 00:18:28.335 "abort": true, 00:18:28.335 "nvme_admin": false, 00:18:28.335 "nvme_io": false 00:18:28.335 }, 00:18:28.335 "memory_domains": [ 00:18:28.335 { 00:18:28.335 "dma_device_id": "system", 00:18:28.335 "dma_device_type": 1 00:18:28.335 }, 00:18:28.335 { 00:18:28.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.335 "dma_device_type": 2 00:18:28.335 } 00:18:28.335 ], 00:18:28.335 "driver_specific": {} 00:18:28.335 } 00:18:28.335 ] 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.335 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.336 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.336 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.336 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.336 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.596 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.596 "name": "Existed_Raid", 00:18:28.596 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:28.596 "strip_size_kb": 0, 00:18:28.596 "state": "online", 00:18:28.596 "raid_level": "raid1", 00:18:28.596 "superblock": true, 00:18:28.596 "num_base_bdevs": 4, 00:18:28.596 "num_base_bdevs_discovered": 4, 00:18:28.596 "num_base_bdevs_operational": 4, 00:18:28.596 "base_bdevs_list": [ 00:18:28.596 { 00:18:28.596 "name": "BaseBdev1", 00:18:28.596 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:28.596 "is_configured": true, 00:18:28.596 "data_offset": 2048, 00:18:28.596 "data_size": 63488 00:18:28.596 }, 00:18:28.596 { 00:18:28.596 "name": "BaseBdev2", 00:18:28.596 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:28.596 "is_configured": true, 00:18:28.596 "data_offset": 2048, 00:18:28.596 "data_size": 63488 00:18:28.596 }, 00:18:28.596 { 00:18:28.596 "name": "BaseBdev3", 00:18:28.596 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:28.596 "is_configured": true, 00:18:28.596 "data_offset": 2048, 00:18:28.596 "data_size": 63488 00:18:28.596 }, 00:18:28.596 { 00:18:28.596 "name": "BaseBdev4", 00:18:28.596 "uuid": 
"f22cdfe7-ec10-4cdc-9416-fff27db2e2e7", 00:18:28.596 "is_configured": true, 00:18:28.596 "data_offset": 2048, 00:18:28.596 "data_size": 63488 00:18:28.596 } 00:18:28.596 ] 00:18:28.596 }' 00:18:28.596 13:46:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.596 13:46:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:29.167 [2024-06-10 13:46:43.558722] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:29.167 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:29.167 "name": "Existed_Raid", 00:18:29.167 "aliases": [ 00:18:29.167 "371ea668-02d6-45d1-bc80-d89880503c97" 00:18:29.167 ], 00:18:29.167 "product_name": "Raid Volume", 00:18:29.167 "block_size": 512, 00:18:29.167 "num_blocks": 63488, 00:18:29.167 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:29.167 "assigned_rate_limits": { 00:18:29.167 "rw_ios_per_sec": 
0, 00:18:29.167 "rw_mbytes_per_sec": 0, 00:18:29.167 "r_mbytes_per_sec": 0, 00:18:29.167 "w_mbytes_per_sec": 0 00:18:29.167 }, 00:18:29.167 "claimed": false, 00:18:29.167 "zoned": false, 00:18:29.167 "supported_io_types": { 00:18:29.167 "read": true, 00:18:29.167 "write": true, 00:18:29.167 "unmap": false, 00:18:29.167 "write_zeroes": true, 00:18:29.167 "flush": false, 00:18:29.167 "reset": true, 00:18:29.167 "compare": false, 00:18:29.167 "compare_and_write": false, 00:18:29.167 "abort": false, 00:18:29.167 "nvme_admin": false, 00:18:29.167 "nvme_io": false 00:18:29.167 }, 00:18:29.167 "memory_domains": [ 00:18:29.167 { 00:18:29.167 "dma_device_id": "system", 00:18:29.167 "dma_device_type": 1 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.167 "dma_device_type": 2 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "system", 00:18:29.167 "dma_device_type": 1 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.167 "dma_device_type": 2 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "system", 00:18:29.167 "dma_device_type": 1 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.167 "dma_device_type": 2 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "system", 00:18:29.167 "dma_device_type": 1 00:18:29.167 }, 00:18:29.167 { 00:18:29.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.167 "dma_device_type": 2 00:18:29.167 } 00:18:29.167 ], 00:18:29.167 "driver_specific": { 00:18:29.167 "raid": { 00:18:29.167 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:29.167 "strip_size_kb": 0, 00:18:29.167 "state": "online", 00:18:29.167 "raid_level": "raid1", 00:18:29.167 "superblock": true, 00:18:29.167 "num_base_bdevs": 4, 00:18:29.167 "num_base_bdevs_discovered": 4, 00:18:29.167 "num_base_bdevs_operational": 4, 00:18:29.168 "base_bdevs_list": [ 00:18:29.168 { 00:18:29.168 "name": "BaseBdev1", 
00:18:29.168 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:29.168 "is_configured": true, 00:18:29.168 "data_offset": 2048, 00:18:29.168 "data_size": 63488 00:18:29.168 }, 00:18:29.168 { 00:18:29.168 "name": "BaseBdev2", 00:18:29.168 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:29.168 "is_configured": true, 00:18:29.168 "data_offset": 2048, 00:18:29.168 "data_size": 63488 00:18:29.168 }, 00:18:29.168 { 00:18:29.168 "name": "BaseBdev3", 00:18:29.168 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:29.168 "is_configured": true, 00:18:29.168 "data_offset": 2048, 00:18:29.168 "data_size": 63488 00:18:29.168 }, 00:18:29.168 { 00:18:29.168 "name": "BaseBdev4", 00:18:29.168 "uuid": "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7", 00:18:29.168 "is_configured": true, 00:18:29.168 "data_offset": 2048, 00:18:29.168 "data_size": 63488 00:18:29.168 } 00:18:29.168 ] 00:18:29.168 } 00:18:29.168 } 00:18:29.168 }' 00:18:29.168 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:29.168 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:29.168 BaseBdev2 00:18:29.168 BaseBdev3 00:18:29.168 BaseBdev4' 00:18:29.168 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.168 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:29.168 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.428 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.428 "name": "BaseBdev1", 00:18:29.428 "aliases": [ 00:18:29.428 "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c" 00:18:29.428 ], 00:18:29.428 "product_name": "Malloc disk", 
00:18:29.428 "block_size": 512, 00:18:29.428 "num_blocks": 65536, 00:18:29.428 "uuid": "b6e2d01a-0b3b-4d34-82c4-7f7a6f7ab28c", 00:18:29.428 "assigned_rate_limits": { 00:18:29.428 "rw_ios_per_sec": 0, 00:18:29.428 "rw_mbytes_per_sec": 0, 00:18:29.428 "r_mbytes_per_sec": 0, 00:18:29.428 "w_mbytes_per_sec": 0 00:18:29.428 }, 00:18:29.428 "claimed": true, 00:18:29.428 "claim_type": "exclusive_write", 00:18:29.428 "zoned": false, 00:18:29.428 "supported_io_types": { 00:18:29.428 "read": true, 00:18:29.428 "write": true, 00:18:29.428 "unmap": true, 00:18:29.428 "write_zeroes": true, 00:18:29.428 "flush": true, 00:18:29.428 "reset": true, 00:18:29.428 "compare": false, 00:18:29.428 "compare_and_write": false, 00:18:29.428 "abort": true, 00:18:29.428 "nvme_admin": false, 00:18:29.428 "nvme_io": false 00:18:29.428 }, 00:18:29.428 "memory_domains": [ 00:18:29.428 { 00:18:29.428 "dma_device_id": "system", 00:18:29.428 "dma_device_type": 1 00:18:29.428 }, 00:18:29.428 { 00:18:29.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.428 "dma_device_type": 2 00:18:29.428 } 00:18:29.428 ], 00:18:29.428 "driver_specific": {} 00:18:29.428 }' 00:18:29.428 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.428 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.689 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.689 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.689 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.689 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.690 13:46:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:29.690 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.950 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.950 "name": "BaseBdev2", 00:18:29.950 "aliases": [ 00:18:29.950 "66285652-3f8c-4f44-9a63-db66a5860efc" 00:18:29.950 ], 00:18:29.950 "product_name": "Malloc disk", 00:18:29.950 "block_size": 512, 00:18:29.950 "num_blocks": 65536, 00:18:29.950 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:29.950 "assigned_rate_limits": { 00:18:29.950 "rw_ios_per_sec": 0, 00:18:29.950 "rw_mbytes_per_sec": 0, 00:18:29.950 "r_mbytes_per_sec": 0, 00:18:29.950 "w_mbytes_per_sec": 0 00:18:29.950 }, 00:18:29.950 "claimed": true, 00:18:29.950 "claim_type": "exclusive_write", 00:18:29.950 "zoned": false, 00:18:29.950 "supported_io_types": { 00:18:29.950 "read": true, 00:18:29.950 "write": true, 00:18:29.950 "unmap": true, 00:18:29.950 "write_zeroes": true, 00:18:29.950 "flush": true, 00:18:29.950 "reset": true, 00:18:29.950 "compare": false, 00:18:29.950 "compare_and_write": false, 00:18:29.950 "abort": true, 00:18:29.950 "nvme_admin": false, 00:18:29.950 "nvme_io": false 00:18:29.950 }, 00:18:29.950 "memory_domains": [ 00:18:29.950 { 
00:18:29.950 "dma_device_id": "system", 00:18:29.950 "dma_device_type": 1 00:18:29.950 }, 00:18:29.950 { 00:18:29.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.950 "dma_device_type": 2 00:18:29.950 } 00:18:29.950 ], 00:18:29.950 "driver_specific": {} 00:18:29.950 }' 00:18:29.950 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.950 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.209 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.468 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.468 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.468 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:30.468 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.468 13:46:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.468 "name": "BaseBdev3", 00:18:30.468 "aliases": [ 00:18:30.468 "69406e98-11cd-4217-b99b-3809b7b61734" 00:18:30.468 ], 00:18:30.468 "product_name": "Malloc disk", 00:18:30.468 "block_size": 512, 00:18:30.468 "num_blocks": 65536, 00:18:30.468 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:30.468 "assigned_rate_limits": { 00:18:30.468 "rw_ios_per_sec": 0, 00:18:30.468 "rw_mbytes_per_sec": 0, 00:18:30.468 "r_mbytes_per_sec": 0, 00:18:30.468 "w_mbytes_per_sec": 0 00:18:30.468 }, 00:18:30.468 "claimed": true, 00:18:30.468 "claim_type": "exclusive_write", 00:18:30.468 "zoned": false, 00:18:30.468 "supported_io_types": { 00:18:30.468 "read": true, 00:18:30.468 "write": true, 00:18:30.468 "unmap": true, 00:18:30.468 "write_zeroes": true, 00:18:30.468 "flush": true, 00:18:30.468 "reset": true, 00:18:30.468 "compare": false, 00:18:30.468 "compare_and_write": false, 00:18:30.468 "abort": true, 00:18:30.468 "nvme_admin": false, 00:18:30.468 "nvme_io": false 00:18:30.468 }, 00:18:30.468 "memory_domains": [ 00:18:30.468 { 00:18:30.468 "dma_device_id": "system", 00:18:30.468 "dma_device_type": 1 00:18:30.468 }, 00:18:30.469 { 00:18:30.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.469 "dma_device_type": 2 00:18:30.469 } 00:18:30.469 ], 00:18:30.469 "driver_specific": {} 00:18:30.469 }' 00:18:30.469 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.469 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.728 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.728 13:46:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.728 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.728 13:46:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.728 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.729 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.729 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.729 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.989 "name": "BaseBdev4", 00:18:30.989 "aliases": [ 00:18:30.989 "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7" 00:18:30.989 ], 00:18:30.989 "product_name": "Malloc disk", 00:18:30.989 "block_size": 512, 00:18:30.989 "num_blocks": 65536, 00:18:30.989 "uuid": "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7", 00:18:30.989 "assigned_rate_limits": { 00:18:30.989 "rw_ios_per_sec": 0, 00:18:30.989 "rw_mbytes_per_sec": 0, 00:18:30.989 "r_mbytes_per_sec": 0, 00:18:30.989 "w_mbytes_per_sec": 0 00:18:30.989 }, 00:18:30.989 "claimed": true, 00:18:30.989 "claim_type": "exclusive_write", 00:18:30.989 "zoned": false, 00:18:30.989 "supported_io_types": { 00:18:30.989 "read": true, 00:18:30.989 "write": true, 00:18:30.989 "unmap": true, 00:18:30.989 "write_zeroes": true, 00:18:30.989 "flush": 
true, 00:18:30.989 "reset": true, 00:18:30.989 "compare": false, 00:18:30.989 "compare_and_write": false, 00:18:30.989 "abort": true, 00:18:30.989 "nvme_admin": false, 00:18:30.989 "nvme_io": false 00:18:30.989 }, 00:18:30.989 "memory_domains": [ 00:18:30.989 { 00:18:30.989 "dma_device_id": "system", 00:18:30.989 "dma_device_type": 1 00:18:30.989 }, 00:18:30.989 { 00:18:30.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.989 "dma_device_type": 2 00:18:30.989 } 00:18:30.989 ], 00:18:30.989 "driver_specific": {} 00:18:30.989 }' 00:18:30.989 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.248 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.508 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.508 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.508 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:18:31.508 [2024-06-10 13:46:45.972633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.768 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.769 13:46:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.769 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.769 "name": "Existed_Raid", 00:18:31.769 "uuid": "371ea668-02d6-45d1-bc80-d89880503c97", 00:18:31.769 "strip_size_kb": 0, 00:18:31.769 "state": "online", 00:18:31.769 "raid_level": "raid1", 00:18:31.769 "superblock": true, 00:18:31.769 "num_base_bdevs": 4, 00:18:31.769 "num_base_bdevs_discovered": 3, 00:18:31.769 "num_base_bdevs_operational": 3, 00:18:31.769 "base_bdevs_list": [ 00:18:31.769 { 00:18:31.769 "name": null, 00:18:31.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.769 "is_configured": false, 00:18:31.769 "data_offset": 2048, 00:18:31.769 "data_size": 63488 00:18:31.769 }, 00:18:31.769 { 00:18:31.769 "name": "BaseBdev2", 00:18:31.769 "uuid": "66285652-3f8c-4f44-9a63-db66a5860efc", 00:18:31.769 "is_configured": true, 00:18:31.769 "data_offset": 2048, 00:18:31.769 "data_size": 63488 00:18:31.769 }, 00:18:31.769 { 00:18:31.769 "name": "BaseBdev3", 00:18:31.769 "uuid": "69406e98-11cd-4217-b99b-3809b7b61734", 00:18:31.769 "is_configured": true, 00:18:31.769 "data_offset": 2048, 00:18:31.769 "data_size": 63488 00:18:31.769 }, 00:18:31.769 { 00:18:31.769 "name": "BaseBdev4", 00:18:31.769 "uuid": "f22cdfe7-ec10-4cdc-9416-fff27db2e2e7", 00:18:31.769 "is_configured": true, 00:18:31.769 "data_offset": 2048, 00:18:31.769 "data_size": 63488 00:18:31.769 } 00:18:31.769 ] 00:18:31.769 }' 00:18:31.769 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.769 13:46:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:32.338 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:32.338 13:46:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:32.338 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:32.338 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.598 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:32.598 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:32.598 13:46:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:32.858 [2024-06-10 13:46:47.151632] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:32.858 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:32.858 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:32.858 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.858 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:33.118 [2024-06-10 13:46:47.546783] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.118 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:33.378 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:33.378 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:33.378 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:33.638 [2024-06-10 13:46:47.941841] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:33.638 [2024-06-10 13:46:47.941903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:33.638 [2024-06-10 13:46:47.948095] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:33.638 [2024-06-10 13:46:47.948120] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:33.638 [2024-06-10 13:46:47.948126] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2087030 name Existed_Raid, state offline 00:18:33.638 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:33.638 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:33.638 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.638 13:46:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:33.898 BaseBdev2 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:33.898 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.158 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:34.418 [ 00:18:34.418 { 00:18:34.418 "name": "BaseBdev2", 00:18:34.418 "aliases": [ 00:18:34.418 "1f16dd60-fedc-4a0d-af2e-49f497e9e18a" 00:18:34.418 ], 00:18:34.418 "product_name": "Malloc disk", 00:18:34.418 "block_size": 512, 00:18:34.418 "num_blocks": 65536, 00:18:34.418 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:34.418 "assigned_rate_limits": { 00:18:34.418 "rw_ios_per_sec": 0, 00:18:34.418 "rw_mbytes_per_sec": 0, 00:18:34.418 "r_mbytes_per_sec": 0, 00:18:34.418 "w_mbytes_per_sec": 0 00:18:34.418 }, 00:18:34.418 "claimed": false, 00:18:34.418 "zoned": false, 00:18:34.418 "supported_io_types": { 00:18:34.418 "read": true, 00:18:34.418 "write": true, 00:18:34.418 "unmap": true, 00:18:34.418 "write_zeroes": true, 00:18:34.418 "flush": true, 00:18:34.418 "reset": true, 00:18:34.418 "compare": false, 00:18:34.418 "compare_and_write": false, 00:18:34.418 "abort": true, 00:18:34.418 "nvme_admin": false, 00:18:34.418 "nvme_io": false 00:18:34.418 }, 00:18:34.418 "memory_domains": [ 00:18:34.418 { 00:18:34.418 "dma_device_id": "system", 00:18:34.418 "dma_device_type": 1 00:18:34.418 }, 00:18:34.418 { 00:18:34.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.418 "dma_device_type": 2 00:18:34.418 } 00:18:34.418 ], 00:18:34.418 "driver_specific": {} 00:18:34.418 } 00:18:34.418 ] 00:18:34.418 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:34.418 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:34.418 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:34.418 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:34.679 
BaseBdev3 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:34.679 13:46:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.679 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:34.939 [ 00:18:34.939 { 00:18:34.939 "name": "BaseBdev3", 00:18:34.939 "aliases": [ 00:18:34.939 "3562d15c-4c58-4f49-8fb1-1893fa413dab" 00:18:34.939 ], 00:18:34.939 "product_name": "Malloc disk", 00:18:34.939 "block_size": 512, 00:18:34.939 "num_blocks": 65536, 00:18:34.939 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:34.939 "assigned_rate_limits": { 00:18:34.939 "rw_ios_per_sec": 0, 00:18:34.939 "rw_mbytes_per_sec": 0, 00:18:34.939 "r_mbytes_per_sec": 0, 00:18:34.939 "w_mbytes_per_sec": 0 00:18:34.939 }, 00:18:34.939 "claimed": false, 00:18:34.939 "zoned": false, 00:18:34.939 "supported_io_types": { 00:18:34.939 "read": true, 00:18:34.939 "write": true, 00:18:34.939 "unmap": true, 00:18:34.939 "write_zeroes": true, 00:18:34.939 "flush": true, 00:18:34.939 "reset": true, 00:18:34.939 "compare": false, 00:18:34.939 "compare_and_write": false, 
00:18:34.939 "abort": true, 00:18:34.939 "nvme_admin": false, 00:18:34.939 "nvme_io": false 00:18:34.939 }, 00:18:34.939 "memory_domains": [ 00:18:34.939 { 00:18:34.939 "dma_device_id": "system", 00:18:34.939 "dma_device_type": 1 00:18:34.939 }, 00:18:34.939 { 00:18:34.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.939 "dma_device_type": 2 00:18:34.939 } 00:18:34.939 ], 00:18:34.939 "driver_specific": {} 00:18:34.939 } 00:18:34.939 ] 00:18:34.939 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:34.939 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:34.939 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:34.939 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:35.199 BaseBdev4 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:35.199 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:35.459 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:35.459 [ 00:18:35.459 { 00:18:35.459 "name": "BaseBdev4", 00:18:35.459 "aliases": [ 00:18:35.459 "261c6026-be89-4234-a508-3cd8b711f4ff" 00:18:35.459 ], 00:18:35.459 "product_name": "Malloc disk", 00:18:35.459 "block_size": 512, 00:18:35.459 "num_blocks": 65536, 00:18:35.459 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:35.459 "assigned_rate_limits": { 00:18:35.459 "rw_ios_per_sec": 0, 00:18:35.459 "rw_mbytes_per_sec": 0, 00:18:35.459 "r_mbytes_per_sec": 0, 00:18:35.459 "w_mbytes_per_sec": 0 00:18:35.459 }, 00:18:35.459 "claimed": false, 00:18:35.459 "zoned": false, 00:18:35.459 "supported_io_types": { 00:18:35.459 "read": true, 00:18:35.459 "write": true, 00:18:35.459 "unmap": true, 00:18:35.459 "write_zeroes": true, 00:18:35.459 "flush": true, 00:18:35.459 "reset": true, 00:18:35.459 "compare": false, 00:18:35.459 "compare_and_write": false, 00:18:35.459 "abort": true, 00:18:35.459 "nvme_admin": false, 00:18:35.459 "nvme_io": false 00:18:35.459 }, 00:18:35.459 "memory_domains": [ 00:18:35.459 { 00:18:35.459 "dma_device_id": "system", 00:18:35.459 "dma_device_type": 1 00:18:35.459 }, 00:18:35.459 { 00:18:35.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.459 "dma_device_type": 2 00:18:35.459 } 00:18:35.459 ], 00:18:35.459 "driver_specific": {} 00:18:35.459 } 00:18:35.459 ] 00:18:35.720 13:46:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:35.720 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:35.720 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:35.720 13:46:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 
BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:35.720 [2024-06-10 13:46:50.133778] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:35.720 [2024-06-10 13:46:50.133811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:35.720 [2024-06-10 13:46:50.133825] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:35.720 [2024-06-10 13:46:50.134941] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:35.720 [2024-06-10 13:46:50.134976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.720 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:35.980 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.980 "name": "Existed_Raid", 00:18:35.980 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:35.980 "strip_size_kb": 0, 00:18:35.980 "state": "configuring", 00:18:35.980 "raid_level": "raid1", 00:18:35.980 "superblock": true, 00:18:35.980 "num_base_bdevs": 4, 00:18:35.980 "num_base_bdevs_discovered": 3, 00:18:35.980 "num_base_bdevs_operational": 4, 00:18:35.980 "base_bdevs_list": [ 00:18:35.980 { 00:18:35.980 "name": "BaseBdev1", 00:18:35.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.980 "is_configured": false, 00:18:35.980 "data_offset": 0, 00:18:35.980 "data_size": 0 00:18:35.980 }, 00:18:35.980 { 00:18:35.980 "name": "BaseBdev2", 00:18:35.980 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:35.980 "is_configured": true, 00:18:35.980 "data_offset": 2048, 00:18:35.980 "data_size": 63488 00:18:35.980 }, 00:18:35.980 { 00:18:35.980 "name": "BaseBdev3", 00:18:35.980 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:35.980 "is_configured": true, 00:18:35.980 "data_offset": 2048, 00:18:35.980 "data_size": 63488 00:18:35.980 }, 00:18:35.980 { 00:18:35.980 "name": "BaseBdev4", 00:18:35.980 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:35.980 "is_configured": true, 00:18:35.980 "data_offset": 2048, 00:18:35.980 "data_size": 63488 00:18:35.980 } 00:18:35.980 ] 00:18:35.980 }' 00:18:35.980 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.980 13:46:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:36.550 13:46:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:36.810 [2024-06-10 13:46:51.084151] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.810 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:37.070 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.070 "name": "Existed_Raid", 00:18:37.070 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:37.070 "strip_size_kb": 0, 
00:18:37.070 "state": "configuring", 00:18:37.070 "raid_level": "raid1", 00:18:37.070 "superblock": true, 00:18:37.070 "num_base_bdevs": 4, 00:18:37.070 "num_base_bdevs_discovered": 2, 00:18:37.070 "num_base_bdevs_operational": 4, 00:18:37.070 "base_bdevs_list": [ 00:18:37.070 { 00:18:37.070 "name": "BaseBdev1", 00:18:37.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:37.070 "is_configured": false, 00:18:37.070 "data_offset": 0, 00:18:37.070 "data_size": 0 00:18:37.070 }, 00:18:37.070 { 00:18:37.070 "name": null, 00:18:37.070 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:37.070 "is_configured": false, 00:18:37.070 "data_offset": 2048, 00:18:37.070 "data_size": 63488 00:18:37.070 }, 00:18:37.070 { 00:18:37.070 "name": "BaseBdev3", 00:18:37.070 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:37.070 "is_configured": true, 00:18:37.070 "data_offset": 2048, 00:18:37.070 "data_size": 63488 00:18:37.070 }, 00:18:37.070 { 00:18:37.070 "name": "BaseBdev4", 00:18:37.070 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:37.070 "is_configured": true, 00:18:37.070 "data_offset": 2048, 00:18:37.070 "data_size": 63488 00:18:37.070 } 00:18:37.070 ] 00:18:37.070 }' 00:18:37.070 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.070 13:46:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.640 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:37.640 13:46:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.640 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:37.640 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:37.899 [2024-06-10 13:46:52.232217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.899 BaseBdev1 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:37.899 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:38.158 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:38.158 [ 00:18:38.158 { 00:18:38.158 "name": "BaseBdev1", 00:18:38.158 "aliases": [ 00:18:38.159 "1fa36beb-2f1a-4980-b21d-100e804898f2" 00:18:38.159 ], 00:18:38.159 "product_name": "Malloc disk", 00:18:38.159 "block_size": 512, 00:18:38.159 "num_blocks": 65536, 00:18:38.159 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:38.159 "assigned_rate_limits": { 00:18:38.159 "rw_ios_per_sec": 0, 00:18:38.159 "rw_mbytes_per_sec": 0, 00:18:38.159 "r_mbytes_per_sec": 0, 00:18:38.159 "w_mbytes_per_sec": 0 00:18:38.159 }, 00:18:38.159 "claimed": true, 00:18:38.159 "claim_type": "exclusive_write", 
00:18:38.159 "zoned": false, 00:18:38.159 "supported_io_types": { 00:18:38.159 "read": true, 00:18:38.159 "write": true, 00:18:38.159 "unmap": true, 00:18:38.159 "write_zeroes": true, 00:18:38.159 "flush": true, 00:18:38.159 "reset": true, 00:18:38.159 "compare": false, 00:18:38.159 "compare_and_write": false, 00:18:38.159 "abort": true, 00:18:38.159 "nvme_admin": false, 00:18:38.159 "nvme_io": false 00:18:38.159 }, 00:18:38.159 "memory_domains": [ 00:18:38.159 { 00:18:38.159 "dma_device_id": "system", 00:18:38.159 "dma_device_type": 1 00:18:38.159 }, 00:18:38.159 { 00:18:38.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.159 "dma_device_type": 2 00:18:38.159 } 00:18:38.159 ], 00:18:38.159 "driver_specific": {} 00:18:38.159 } 00:18:38.159 ] 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:38.159 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.418 13:46:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.418 "name": "Existed_Raid", 00:18:38.418 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:38.418 "strip_size_kb": 0, 00:18:38.418 "state": "configuring", 00:18:38.418 "raid_level": "raid1", 00:18:38.418 "superblock": true, 00:18:38.418 "num_base_bdevs": 4, 00:18:38.418 "num_base_bdevs_discovered": 3, 00:18:38.418 "num_base_bdevs_operational": 4, 00:18:38.418 "base_bdevs_list": [ 00:18:38.418 { 00:18:38.418 "name": "BaseBdev1", 00:18:38.418 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:38.418 "is_configured": true, 00:18:38.418 "data_offset": 2048, 00:18:38.418 "data_size": 63488 00:18:38.418 }, 00:18:38.418 { 00:18:38.418 "name": null, 00:18:38.418 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:38.418 "is_configured": false, 00:18:38.418 "data_offset": 2048, 00:18:38.418 "data_size": 63488 00:18:38.418 }, 00:18:38.418 { 00:18:38.418 "name": "BaseBdev3", 00:18:38.418 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:38.418 "is_configured": true, 00:18:38.418 "data_offset": 2048, 00:18:38.418 "data_size": 63488 00:18:38.418 }, 00:18:38.418 { 00:18:38.418 "name": "BaseBdev4", 00:18:38.418 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:38.418 "is_configured": true, 00:18:38.418 "data_offset": 2048, 00:18:38.418 "data_size": 63488 00:18:38.418 } 00:18:38.418 ] 00:18:38.418 }' 00:18:38.418 13:46:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.418 13:46:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:38.986 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.986 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:39.245 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:39.245 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:39.505 [2024-06-10 13:46:53.780166] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.505 13:46:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.505 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:39.765 13:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.765 "name": "Existed_Raid", 00:18:39.765 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:39.765 "strip_size_kb": 0, 00:18:39.765 "state": "configuring", 00:18:39.765 "raid_level": "raid1", 00:18:39.765 "superblock": true, 00:18:39.765 "num_base_bdevs": 4, 00:18:39.765 "num_base_bdevs_discovered": 2, 00:18:39.765 "num_base_bdevs_operational": 4, 00:18:39.765 "base_bdevs_list": [ 00:18:39.765 { 00:18:39.765 "name": "BaseBdev1", 00:18:39.765 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:39.765 "is_configured": true, 00:18:39.765 "data_offset": 2048, 00:18:39.765 "data_size": 63488 00:18:39.765 }, 00:18:39.765 { 00:18:39.765 "name": null, 00:18:39.765 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:39.765 "is_configured": false, 00:18:39.765 "data_offset": 2048, 00:18:39.765 "data_size": 63488 00:18:39.765 }, 00:18:39.765 { 00:18:39.765 "name": null, 00:18:39.765 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:39.765 "is_configured": false, 00:18:39.765 "data_offset": 2048, 00:18:39.765 "data_size": 63488 00:18:39.765 }, 00:18:39.765 { 00:18:39.765 "name": "BaseBdev4", 00:18:39.765 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:39.765 "is_configured": true, 00:18:39.765 "data_offset": 2048, 00:18:39.765 "data_size": 63488 00:18:39.765 } 00:18:39.765 ] 00:18:39.765 }' 00:18:39.765 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.765 13:46:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.334 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.334 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:40.334 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:40.334 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:40.595 [2024-06-10 13:46:54.923072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.595 13:46:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.595 13:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.856 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.856 "name": "Existed_Raid", 00:18:40.856 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:40.856 "strip_size_kb": 0, 00:18:40.856 "state": "configuring", 00:18:40.856 "raid_level": "raid1", 00:18:40.856 "superblock": true, 00:18:40.856 "num_base_bdevs": 4, 00:18:40.856 "num_base_bdevs_discovered": 3, 00:18:40.856 "num_base_bdevs_operational": 4, 00:18:40.856 "base_bdevs_list": [ 00:18:40.856 { 00:18:40.856 "name": "BaseBdev1", 00:18:40.856 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:40.856 "is_configured": true, 00:18:40.856 "data_offset": 2048, 00:18:40.856 "data_size": 63488 00:18:40.856 }, 00:18:40.856 { 00:18:40.856 "name": null, 00:18:40.856 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:40.856 "is_configured": false, 00:18:40.856 "data_offset": 2048, 00:18:40.856 "data_size": 63488 00:18:40.856 }, 00:18:40.856 { 00:18:40.856 "name": "BaseBdev3", 00:18:40.856 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:40.856 "is_configured": true, 00:18:40.856 "data_offset": 2048, 00:18:40.856 "data_size": 63488 00:18:40.856 }, 00:18:40.856 { 00:18:40.856 "name": "BaseBdev4", 00:18:40.856 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:40.856 "is_configured": true, 00:18:40.856 "data_offset": 2048, 00:18:40.856 "data_size": 63488 00:18:40.856 } 00:18:40.856 ] 00:18:40.856 }' 00:18:40.856 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.856 13:46:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.427 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:41.427 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.687 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:41.687 13:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:41.687 [2024-06-10 13:46:56.102071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.687 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.947 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.947 "name": "Existed_Raid", 00:18:41.947 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:41.947 "strip_size_kb": 0, 00:18:41.947 "state": "configuring", 00:18:41.947 "raid_level": "raid1", 00:18:41.947 "superblock": true, 00:18:41.947 "num_base_bdevs": 4, 00:18:41.947 "num_base_bdevs_discovered": 2, 00:18:41.947 "num_base_bdevs_operational": 4, 00:18:41.947 "base_bdevs_list": [ 00:18:41.947 { 00:18:41.947 "name": null, 00:18:41.947 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:41.947 "is_configured": false, 00:18:41.947 "data_offset": 2048, 00:18:41.947 "data_size": 63488 00:18:41.947 }, 00:18:41.947 { 00:18:41.947 "name": null, 00:18:41.947 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:41.947 "is_configured": false, 00:18:41.947 "data_offset": 2048, 00:18:41.947 "data_size": 63488 00:18:41.947 }, 00:18:41.948 { 00:18:41.948 "name": "BaseBdev3", 00:18:41.948 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:41.948 "is_configured": true, 00:18:41.948 "data_offset": 2048, 00:18:41.948 "data_size": 63488 00:18:41.948 }, 00:18:41.948 { 00:18:41.948 "name": "BaseBdev4", 00:18:41.948 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:41.948 "is_configured": true, 00:18:41.948 "data_offset": 2048, 00:18:41.948 "data_size": 63488 00:18:41.948 } 00:18:41.948 ] 00:18:41.948 }' 00:18:41.948 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.948 13:46:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:42.517 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.517 13:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:42.776 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:42.776 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:43.036 [2024-06-10 13:46:57.275182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.036 "name": "Existed_Raid", 00:18:43.036 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:43.036 "strip_size_kb": 0, 00:18:43.036 "state": "configuring", 00:18:43.036 "raid_level": "raid1", 00:18:43.036 "superblock": true, 00:18:43.036 "num_base_bdevs": 4, 00:18:43.036 "num_base_bdevs_discovered": 3, 00:18:43.036 "num_base_bdevs_operational": 4, 00:18:43.036 "base_bdevs_list": [ 00:18:43.036 { 00:18:43.036 "name": null, 00:18:43.036 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:43.036 "is_configured": false, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 }, 00:18:43.036 { 00:18:43.036 "name": "BaseBdev2", 00:18:43.036 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:43.036 "is_configured": true, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 }, 00:18:43.036 { 00:18:43.036 "name": "BaseBdev3", 00:18:43.036 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:43.036 "is_configured": true, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 }, 00:18:43.036 { 00:18:43.036 "name": "BaseBdev4", 00:18:43.036 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:43.036 "is_configured": true, 00:18:43.036 "data_offset": 2048, 00:18:43.036 "data_size": 63488 00:18:43.036 } 00:18:43.036 ] 00:18:43.036 }' 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.036 13:46:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:43.606 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.606 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:43.866 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:43.866 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.866 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:44.127 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1fa36beb-2f1a-4980-b21d-100e804898f2 00:18:44.388 [2024-06-10 13:46:58.615677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:44.388 [2024-06-10 13:46:58.615795] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20889a0 00:18:44.388 [2024-06-10 13:46:58.615803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:44.388 [2024-06-10 13:46:58.615947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223a250 00:18:44.388 [2024-06-10 13:46:58.616045] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20889a0 00:18:44.388 [2024-06-10 13:46:58.616050] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20889a0 00:18:44.388 [2024-06-10 13:46:58.616123] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:44.388 NewBaseBdev 
00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.388 13:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:44.652 [ 00:18:44.652 { 00:18:44.652 "name": "NewBaseBdev", 00:18:44.652 "aliases": [ 00:18:44.652 "1fa36beb-2f1a-4980-b21d-100e804898f2" 00:18:44.652 ], 00:18:44.652 "product_name": "Malloc disk", 00:18:44.652 "block_size": 512, 00:18:44.652 "num_blocks": 65536, 00:18:44.652 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:44.652 "assigned_rate_limits": { 00:18:44.652 "rw_ios_per_sec": 0, 00:18:44.652 "rw_mbytes_per_sec": 0, 00:18:44.652 "r_mbytes_per_sec": 0, 00:18:44.652 "w_mbytes_per_sec": 0 00:18:44.652 }, 00:18:44.652 "claimed": true, 00:18:44.652 "claim_type": "exclusive_write", 00:18:44.652 "zoned": false, 00:18:44.652 "supported_io_types": { 00:18:44.652 "read": true, 00:18:44.652 "write": true, 00:18:44.652 "unmap": true, 00:18:44.652 "write_zeroes": true, 00:18:44.652 "flush": true, 00:18:44.652 "reset": true, 00:18:44.652 "compare": 
false, 00:18:44.652 "compare_and_write": false, 00:18:44.652 "abort": true, 00:18:44.652 "nvme_admin": false, 00:18:44.652 "nvme_io": false 00:18:44.652 }, 00:18:44.652 "memory_domains": [ 00:18:44.652 { 00:18:44.652 "dma_device_id": "system", 00:18:44.652 "dma_device_type": 1 00:18:44.652 }, 00:18:44.652 { 00:18:44.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.652 "dma_device_type": 2 00:18:44.652 } 00:18:44.652 ], 00:18:44.652 "driver_specific": {} 00:18:44.652 } 00:18:44.652 ] 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.652 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.652 
13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.970 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.970 "name": "Existed_Raid", 00:18:44.970 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:44.970 "strip_size_kb": 0, 00:18:44.970 "state": "online", 00:18:44.970 "raid_level": "raid1", 00:18:44.970 "superblock": true, 00:18:44.970 "num_base_bdevs": 4, 00:18:44.970 "num_base_bdevs_discovered": 4, 00:18:44.970 "num_base_bdevs_operational": 4, 00:18:44.970 "base_bdevs_list": [ 00:18:44.970 { 00:18:44.970 "name": "NewBaseBdev", 00:18:44.970 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:44.970 "is_configured": true, 00:18:44.970 "data_offset": 2048, 00:18:44.970 "data_size": 63488 00:18:44.970 }, 00:18:44.970 { 00:18:44.970 "name": "BaseBdev2", 00:18:44.970 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:44.970 "is_configured": true, 00:18:44.970 "data_offset": 2048, 00:18:44.970 "data_size": 63488 00:18:44.970 }, 00:18:44.970 { 00:18:44.970 "name": "BaseBdev3", 00:18:44.970 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:44.970 "is_configured": true, 00:18:44.970 "data_offset": 2048, 00:18:44.970 "data_size": 63488 00:18:44.970 }, 00:18:44.970 { 00:18:44.970 "name": "BaseBdev4", 00:18:44.970 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:44.970 "is_configured": true, 00:18:44.970 "data_offset": 2048, 00:18:44.970 "data_size": 63488 00:18:44.970 } 00:18:44.970 ] 00:18:44.970 }' 00:18:44.970 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.970 13:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # 
local raid_bdev_name=Existed_Raid 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:45.566 [2024-06-10 13:46:59.947297] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:45.566 "name": "Existed_Raid", 00:18:45.566 "aliases": [ 00:18:45.566 "fbc750ef-82e2-400b-b687-424f91f83f3a" 00:18:45.566 ], 00:18:45.566 "product_name": "Raid Volume", 00:18:45.566 "block_size": 512, 00:18:45.566 "num_blocks": 63488, 00:18:45.566 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:45.566 "assigned_rate_limits": { 00:18:45.566 "rw_ios_per_sec": 0, 00:18:45.566 "rw_mbytes_per_sec": 0, 00:18:45.566 "r_mbytes_per_sec": 0, 00:18:45.566 "w_mbytes_per_sec": 0 00:18:45.566 }, 00:18:45.566 "claimed": false, 00:18:45.566 "zoned": false, 00:18:45.566 "supported_io_types": { 00:18:45.566 "read": true, 00:18:45.566 "write": true, 00:18:45.566 "unmap": false, 00:18:45.566 "write_zeroes": true, 00:18:45.566 "flush": false, 00:18:45.566 "reset": true, 00:18:45.566 "compare": false, 00:18:45.566 "compare_and_write": false, 00:18:45.566 "abort": false, 00:18:45.566 "nvme_admin": false, 00:18:45.566 "nvme_io": false 00:18:45.566 }, 00:18:45.566 "memory_domains": [ 
00:18:45.566 { 00:18:45.566 "dma_device_id": "system", 00:18:45.566 "dma_device_type": 1 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.566 "dma_device_type": 2 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "system", 00:18:45.566 "dma_device_type": 1 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.566 "dma_device_type": 2 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "system", 00:18:45.566 "dma_device_type": 1 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.566 "dma_device_type": 2 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "system", 00:18:45.566 "dma_device_type": 1 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.566 "dma_device_type": 2 00:18:45.566 } 00:18:45.566 ], 00:18:45.566 "driver_specific": { 00:18:45.566 "raid": { 00:18:45.566 "uuid": "fbc750ef-82e2-400b-b687-424f91f83f3a", 00:18:45.566 "strip_size_kb": 0, 00:18:45.566 "state": "online", 00:18:45.566 "raid_level": "raid1", 00:18:45.566 "superblock": true, 00:18:45.566 "num_base_bdevs": 4, 00:18:45.566 "num_base_bdevs_discovered": 4, 00:18:45.566 "num_base_bdevs_operational": 4, 00:18:45.566 "base_bdevs_list": [ 00:18:45.566 { 00:18:45.566 "name": "NewBaseBdev", 00:18:45.566 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:45.566 "is_configured": true, 00:18:45.566 "data_offset": 2048, 00:18:45.566 "data_size": 63488 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "name": "BaseBdev2", 00:18:45.566 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:45.566 "is_configured": true, 00:18:45.566 "data_offset": 2048, 00:18:45.566 "data_size": 63488 00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "name": "BaseBdev3", 00:18:45.566 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:45.566 "is_configured": true, 00:18:45.566 "data_offset": 2048, 00:18:45.566 "data_size": 63488 
00:18:45.566 }, 00:18:45.566 { 00:18:45.566 "name": "BaseBdev4", 00:18:45.566 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:45.566 "is_configured": true, 00:18:45.566 "data_offset": 2048, 00:18:45.566 "data_size": 63488 00:18:45.566 } 00:18:45.566 ] 00:18:45.566 } 00:18:45.566 } 00:18:45.566 }' 00:18:45.566 13:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:45.567 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:45.567 BaseBdev2 00:18:45.567 BaseBdev3 00:18:45.567 BaseBdev4' 00:18:45.567 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.567 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:45.567 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.826 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.826 "name": "NewBaseBdev", 00:18:45.826 "aliases": [ 00:18:45.826 "1fa36beb-2f1a-4980-b21d-100e804898f2" 00:18:45.826 ], 00:18:45.826 "product_name": "Malloc disk", 00:18:45.826 "block_size": 512, 00:18:45.826 "num_blocks": 65536, 00:18:45.826 "uuid": "1fa36beb-2f1a-4980-b21d-100e804898f2", 00:18:45.827 "assigned_rate_limits": { 00:18:45.827 "rw_ios_per_sec": 0, 00:18:45.827 "rw_mbytes_per_sec": 0, 00:18:45.827 "r_mbytes_per_sec": 0, 00:18:45.827 "w_mbytes_per_sec": 0 00:18:45.827 }, 00:18:45.827 "claimed": true, 00:18:45.827 "claim_type": "exclusive_write", 00:18:45.827 "zoned": false, 00:18:45.827 "supported_io_types": { 00:18:45.827 "read": true, 00:18:45.827 "write": true, 00:18:45.827 "unmap": true, 00:18:45.827 "write_zeroes": true, 00:18:45.827 "flush": true, 
00:18:45.827 "reset": true, 00:18:45.827 "compare": false, 00:18:45.827 "compare_and_write": false, 00:18:45.827 "abort": true, 00:18:45.827 "nvme_admin": false, 00:18:45.827 "nvme_io": false 00:18:45.827 }, 00:18:45.827 "memory_domains": [ 00:18:45.827 { 00:18:45.827 "dma_device_id": "system", 00:18:45.827 "dma_device_type": 1 00:18:45.827 }, 00:18:45.827 { 00:18:45.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.827 "dma_device_type": 2 00:18:45.827 } 00:18:45.827 ], 00:18:45.827 "driver_specific": {} 00:18:45.827 }' 00:18:45.827 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.827 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.087 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.347 "name": "BaseBdev2", 00:18:46.347 "aliases": [ 00:18:46.347 "1f16dd60-fedc-4a0d-af2e-49f497e9e18a" 00:18:46.347 ], 00:18:46.347 "product_name": "Malloc disk", 00:18:46.347 "block_size": 512, 00:18:46.347 "num_blocks": 65536, 00:18:46.347 "uuid": "1f16dd60-fedc-4a0d-af2e-49f497e9e18a", 00:18:46.347 "assigned_rate_limits": { 00:18:46.347 "rw_ios_per_sec": 0, 00:18:46.347 "rw_mbytes_per_sec": 0, 00:18:46.347 "r_mbytes_per_sec": 0, 00:18:46.347 "w_mbytes_per_sec": 0 00:18:46.347 }, 00:18:46.347 "claimed": true, 00:18:46.347 "claim_type": "exclusive_write", 00:18:46.347 "zoned": false, 00:18:46.347 "supported_io_types": { 00:18:46.347 "read": true, 00:18:46.347 "write": true, 00:18:46.347 "unmap": true, 00:18:46.347 "write_zeroes": true, 00:18:46.347 "flush": true, 00:18:46.347 "reset": true, 00:18:46.347 "compare": false, 00:18:46.347 "compare_and_write": false, 00:18:46.347 "abort": true, 00:18:46.347 "nvme_admin": false, 00:18:46.347 "nvme_io": false 00:18:46.347 }, 00:18:46.347 "memory_domains": [ 00:18:46.347 { 00:18:46.347 "dma_device_id": "system", 00:18:46.347 "dma_device_type": 1 00:18:46.347 }, 00:18:46.347 { 00:18:46.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.347 "dma_device_type": 2 00:18:46.347 } 00:18:46.347 ], 00:18:46.347 "driver_specific": {} 00:18:46.347 }' 00:18:46.347 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.607 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.607 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.607 
13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.607 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.607 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.607 13:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.607 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.607 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.607 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.867 "name": "BaseBdev3", 00:18:46.867 "aliases": [ 00:18:46.867 "3562d15c-4c58-4f49-8fb1-1893fa413dab" 00:18:46.867 ], 00:18:46.867 "product_name": "Malloc disk", 00:18:46.867 "block_size": 512, 00:18:46.867 "num_blocks": 65536, 00:18:46.867 "uuid": "3562d15c-4c58-4f49-8fb1-1893fa413dab", 00:18:46.867 "assigned_rate_limits": { 00:18:46.867 "rw_ios_per_sec": 0, 00:18:46.867 "rw_mbytes_per_sec": 0, 00:18:46.867 "r_mbytes_per_sec": 0, 00:18:46.867 "w_mbytes_per_sec": 0 00:18:46.867 }, 00:18:46.867 "claimed": true, 
00:18:46.867 "claim_type": "exclusive_write", 00:18:46.867 "zoned": false, 00:18:46.867 "supported_io_types": { 00:18:46.867 "read": true, 00:18:46.867 "write": true, 00:18:46.867 "unmap": true, 00:18:46.867 "write_zeroes": true, 00:18:46.867 "flush": true, 00:18:46.867 "reset": true, 00:18:46.867 "compare": false, 00:18:46.867 "compare_and_write": false, 00:18:46.867 "abort": true, 00:18:46.867 "nvme_admin": false, 00:18:46.867 "nvme_io": false 00:18:46.867 }, 00:18:46.867 "memory_domains": [ 00:18:46.867 { 00:18:46.867 "dma_device_id": "system", 00:18:46.867 "dma_device_type": 1 00:18:46.867 }, 00:18:46.867 { 00:18:46.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.867 "dma_device_type": 2 00:18:46.867 } 00:18:46.867 ], 00:18:46.867 "driver_specific": {} 00:18:46.867 }' 00:18:46.867 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.126 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.386 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.386 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.386 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.386 13:47:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.386 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.387 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:47.387 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.647 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.647 "name": "BaseBdev4", 00:18:47.647 "aliases": [ 00:18:47.647 "261c6026-be89-4234-a508-3cd8b711f4ff" 00:18:47.647 ], 00:18:47.647 "product_name": "Malloc disk", 00:18:47.647 "block_size": 512, 00:18:47.647 "num_blocks": 65536, 00:18:47.647 "uuid": "261c6026-be89-4234-a508-3cd8b711f4ff", 00:18:47.647 "assigned_rate_limits": { 00:18:47.647 "rw_ios_per_sec": 0, 00:18:47.647 "rw_mbytes_per_sec": 0, 00:18:47.647 "r_mbytes_per_sec": 0, 00:18:47.647 "w_mbytes_per_sec": 0 00:18:47.647 }, 00:18:47.647 "claimed": true, 00:18:47.647 "claim_type": "exclusive_write", 00:18:47.647 "zoned": false, 00:18:47.647 "supported_io_types": { 00:18:47.647 "read": true, 00:18:47.647 "write": true, 00:18:47.647 "unmap": true, 00:18:47.647 "write_zeroes": true, 00:18:47.647 "flush": true, 00:18:47.647 "reset": true, 00:18:47.647 "compare": false, 00:18:47.647 "compare_and_write": false, 00:18:47.647 "abort": true, 00:18:47.647 "nvme_admin": false, 00:18:47.647 "nvme_io": false 00:18:47.647 }, 00:18:47.647 "memory_domains": [ 00:18:47.647 { 00:18:47.647 "dma_device_id": "system", 00:18:47.647 "dma_device_type": 1 00:18:47.647 }, 00:18:47.647 { 00:18:47.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.647 "dma_device_type": 2 00:18:47.647 } 00:18:47.647 ], 00:18:47.647 "driver_specific": {} 00:18:47.647 }' 00:18:47.647 13:47:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.647 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.647 13:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.647 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.647 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.647 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.647 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:47.907 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:48.167 [2024-06-10 13:47:02.465446] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:48.167 [2024-06-10 13:47:02.465462] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:48.167 [2024-06-10 13:47:02.465501] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:48.167 [2024-06-10 13:47:02.465729] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:48.167 [2024-06-10 13:47:02.465736] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20889a0 name Existed_Raid, state offline 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1606210 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1606210 ']' 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 1606210 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1606210 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1606210' 00:18:48.167 killing process with pid 1606210 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 1606210 00:18:48.167 [2024-06-10 13:47:02.535651] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:48.167 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 1606210 00:18:48.167 [2024-06-10 13:47:02.556877] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:48.428 13:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:48.428 00:18:48.428 real 0m28.132s 00:18:48.428 user 0m52.737s 00:18:48.428 sys 0m4.103s 00:18:48.428 13:47:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:48.428 13:47:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.428 ************************************ 00:18:48.428 END TEST raid_state_function_test_sb 00:18:48.428 ************************************ 00:18:48.428 13:47:02 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:18:48.428 13:47:02 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:18:48.428 13:47:02 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:48.428 13:47:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:48.428 ************************************ 00:18:48.428 START TEST raid_superblock_test 00:18:48.428 ************************************ 00:18:48.428 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 4 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1612474 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1612474 /var/tmp/spdk-raid.sock 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 1612474 ']' 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:48.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:48.429 13:47:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.429 [2024-06-10 13:47:02.829178] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:18:48.429 [2024-06-10 13:47:02.829236] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612474 ] 00:18:48.690 [2024-06-10 13:47:02.919796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:48.690 [2024-06-10 13:47:02.990347] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:18:48.690 [2024-06-10 13:47:03.030226] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:48.690 [2024-06-10 13:47:03.030251] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:49.260 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:18:49.521 malloc1 00:18:49.521 13:47:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:49.781 [2024-06-10 13:47:04.065444] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:49.781 [2024-06-10 13:47:04.065479] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.781 [2024-06-10 13:47:04.065491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f2550 00:18:49.781 [2024-06-10 13:47:04.065498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.781 [2024-06-10 13:47:04.066842] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.781 [2024-06-10 13:47:04.066863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:49.781 pt1 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:49.781 13:47:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:50.042 malloc2 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:50.042 [2024-06-10 13:47:04.476649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:50.042 [2024-06-10 13:47:04.476678] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.042 [2024-06-10 13:47:04.476687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b40f0 00:18:50.042 [2024-06-10 13:47:04.476693] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.042 [2024-06-10 13:47:04.477953] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.042 [2024-06-10 13:47:04.477973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:50.042 pt2 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:50.042 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:50.302 malloc3 00:18:50.302 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:50.563 [2024-06-10 13:47:04.879717] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:50.564 [2024-06-10 13:47:04.879749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.564 [2024-06-10 13:47:04.879759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b55b0 00:18:50.564 [2024-06-10 13:47:04.879766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.564 [2024-06-10 13:47:04.881020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.564 [2024-06-10 13:47:04.881039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:50.564 pt3 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:50.564 13:47:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:50.564 13:47:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:50.823 malloc4 00:18:50.823 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:50.823 [2024-06-10 13:47:05.282759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:50.823 [2024-06-10 13:47:05.282786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.823 [2024-06-10 13:47:05.282796] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b5d90 00:18:50.823 [2024-06-10 13:47:05.282803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.823 [2024-06-10 13:47:05.284047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.823 [2024-06-10 13:47:05.284066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:50.823 pt4 00:18:50.823 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:50.823 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:50.824 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:51.083 [2024-06-10 13:47:05.483278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:51.083 
[2024-06-10 13:47:05.484394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:51.083 [2024-06-10 13:47:05.484438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:51.083 [2024-06-10 13:47:05.484475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:51.084 [2024-06-10 13:47:05.484619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26ebb60 00:18:51.084 [2024-06-10 13:47:05.484627] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:51.084 [2024-06-10 13:47:05.484783] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27be920 00:18:51.084 [2024-06-10 13:47:05.484901] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26ebb60 00:18:51.084 [2024-06-10 13:47:05.484907] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26ebb60 00:18:51.084 [2024-06-10 13:47:05.484980] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.084 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.344 13:47:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.344 "name": "raid_bdev1", 00:18:51.344 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:51.344 "strip_size_kb": 0, 00:18:51.344 "state": "online", 00:18:51.344 "raid_level": "raid1", 00:18:51.344 "superblock": true, 00:18:51.344 "num_base_bdevs": 4, 00:18:51.344 "num_base_bdevs_discovered": 4, 00:18:51.344 "num_base_bdevs_operational": 4, 00:18:51.344 "base_bdevs_list": [ 00:18:51.344 { 00:18:51.344 "name": "pt1", 00:18:51.344 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.344 "is_configured": true, 00:18:51.344 "data_offset": 2048, 00:18:51.344 "data_size": 63488 00:18:51.344 }, 00:18:51.344 { 00:18:51.344 "name": "pt2", 00:18:51.344 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.344 "is_configured": true, 00:18:51.344 "data_offset": 2048, 00:18:51.344 "data_size": 63488 00:18:51.344 }, 00:18:51.344 { 00:18:51.344 "name": "pt3", 00:18:51.344 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.344 "is_configured": true, 00:18:51.344 "data_offset": 2048, 00:18:51.344 "data_size": 63488 00:18:51.344 }, 00:18:51.344 { 00:18:51.344 "name": "pt4", 00:18:51.344 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:51.344 "is_configured": true, 00:18:51.344 "data_offset": 2048, 00:18:51.344 "data_size": 63488 00:18:51.344 } 00:18:51.344 ] 00:18:51.344 }' 00:18:51.344 13:47:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.344 13:47:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:51.915 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:52.176 [2024-06-10 13:47:06.393798] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:52.176 "name": "raid_bdev1", 00:18:52.176 "aliases": [ 00:18:52.176 "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6" 00:18:52.176 ], 00:18:52.176 "product_name": "Raid Volume", 00:18:52.176 "block_size": 512, 00:18:52.176 "num_blocks": 63488, 00:18:52.176 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:52.176 "assigned_rate_limits": { 00:18:52.176 "rw_ios_per_sec": 0, 00:18:52.176 "rw_mbytes_per_sec": 0, 00:18:52.176 "r_mbytes_per_sec": 0, 00:18:52.176 "w_mbytes_per_sec": 0 00:18:52.176 }, 00:18:52.176 "claimed": false, 00:18:52.176 "zoned": false, 00:18:52.176 "supported_io_types": { 00:18:52.176 "read": true, 00:18:52.176 "write": true, 00:18:52.176 "unmap": false, 00:18:52.176 
"write_zeroes": true, 00:18:52.176 "flush": false, 00:18:52.176 "reset": true, 00:18:52.176 "compare": false, 00:18:52.176 "compare_and_write": false, 00:18:52.176 "abort": false, 00:18:52.176 "nvme_admin": false, 00:18:52.176 "nvme_io": false 00:18:52.176 }, 00:18:52.176 "memory_domains": [ 00:18:52.176 { 00:18:52.176 "dma_device_id": "system", 00:18:52.176 "dma_device_type": 1 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.176 "dma_device_type": 2 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "system", 00:18:52.176 "dma_device_type": 1 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.176 "dma_device_type": 2 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "system", 00:18:52.176 "dma_device_type": 1 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.176 "dma_device_type": 2 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "system", 00:18:52.176 "dma_device_type": 1 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.176 "dma_device_type": 2 00:18:52.176 } 00:18:52.176 ], 00:18:52.176 "driver_specific": { 00:18:52.176 "raid": { 00:18:52.176 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:52.176 "strip_size_kb": 0, 00:18:52.176 "state": "online", 00:18:52.176 "raid_level": "raid1", 00:18:52.176 "superblock": true, 00:18:52.176 "num_base_bdevs": 4, 00:18:52.176 "num_base_bdevs_discovered": 4, 00:18:52.176 "num_base_bdevs_operational": 4, 00:18:52.176 "base_bdevs_list": [ 00:18:52.176 { 00:18:52.176 "name": "pt1", 00:18:52.176 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:52.176 "is_configured": true, 00:18:52.176 "data_offset": 2048, 00:18:52.176 "data_size": 63488 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "name": "pt2", 00:18:52.176 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:52.176 "is_configured": true, 00:18:52.176 
"data_offset": 2048, 00:18:52.176 "data_size": 63488 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "name": "pt3", 00:18:52.176 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:52.176 "is_configured": true, 00:18:52.176 "data_offset": 2048, 00:18:52.176 "data_size": 63488 00:18:52.176 }, 00:18:52.176 { 00:18:52.176 "name": "pt4", 00:18:52.176 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:52.176 "is_configured": true, 00:18:52.176 "data_offset": 2048, 00:18:52.176 "data_size": 63488 00:18:52.176 } 00:18:52.176 ] 00:18:52.176 } 00:18:52.176 } 00:18:52.176 }' 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:52.176 pt2 00:18:52.176 pt3 00:18:52.176 pt4' 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:52.176 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.437 "name": "pt1", 00:18:52.437 "aliases": [ 00:18:52.437 "00000000-0000-0000-0000-000000000001" 00:18:52.437 ], 00:18:52.437 "product_name": "passthru", 00:18:52.437 "block_size": 512, 00:18:52.437 "num_blocks": 65536, 00:18:52.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:52.437 "assigned_rate_limits": { 00:18:52.437 "rw_ios_per_sec": 0, 00:18:52.437 "rw_mbytes_per_sec": 0, 00:18:52.437 "r_mbytes_per_sec": 0, 00:18:52.437 "w_mbytes_per_sec": 0 00:18:52.437 }, 00:18:52.437 "claimed": true, 00:18:52.437 "claim_type": "exclusive_write", 00:18:52.437 "zoned": false, 
00:18:52.437 "supported_io_types": { 00:18:52.437 "read": true, 00:18:52.437 "write": true, 00:18:52.437 "unmap": true, 00:18:52.437 "write_zeroes": true, 00:18:52.437 "flush": true, 00:18:52.437 "reset": true, 00:18:52.437 "compare": false, 00:18:52.437 "compare_and_write": false, 00:18:52.437 "abort": true, 00:18:52.437 "nvme_admin": false, 00:18:52.437 "nvme_io": false 00:18:52.437 }, 00:18:52.437 "memory_domains": [ 00:18:52.437 { 00:18:52.437 "dma_device_id": "system", 00:18:52.437 "dma_device_type": 1 00:18:52.437 }, 00:18:52.437 { 00:18:52.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.437 "dma_device_type": 2 00:18:52.437 } 00:18:52.437 ], 00:18:52.437 "driver_specific": { 00:18:52.437 "passthru": { 00:18:52.437 "name": "pt1", 00:18:52.437 "base_bdev_name": "malloc1" 00:18:52.437 } 00:18:52.437 } 00:18:52.437 }' 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.437 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:52.698 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:52.698 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.698 13:47:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:52.698 13:47:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:52.698 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:52.698 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:52.698 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:52.959 "name": "pt2", 00:18:52.959 "aliases": [ 00:18:52.959 "00000000-0000-0000-0000-000000000002" 00:18:52.959 ], 00:18:52.959 "product_name": "passthru", 00:18:52.959 "block_size": 512, 00:18:52.959 "num_blocks": 65536, 00:18:52.959 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:52.959 "assigned_rate_limits": { 00:18:52.959 "rw_ios_per_sec": 0, 00:18:52.959 "rw_mbytes_per_sec": 0, 00:18:52.959 "r_mbytes_per_sec": 0, 00:18:52.959 "w_mbytes_per_sec": 0 00:18:52.959 }, 00:18:52.959 "claimed": true, 00:18:52.959 "claim_type": "exclusive_write", 00:18:52.959 "zoned": false, 00:18:52.959 "supported_io_types": { 00:18:52.959 "read": true, 00:18:52.959 "write": true, 00:18:52.959 "unmap": true, 00:18:52.959 "write_zeroes": true, 00:18:52.959 "flush": true, 00:18:52.959 "reset": true, 00:18:52.959 "compare": false, 00:18:52.959 "compare_and_write": false, 00:18:52.959 "abort": true, 00:18:52.959 "nvme_admin": false, 00:18:52.959 "nvme_io": false 00:18:52.959 }, 00:18:52.959 "memory_domains": [ 00:18:52.959 { 00:18:52.959 "dma_device_id": "system", 00:18:52.959 "dma_device_type": 1 00:18:52.959 }, 00:18:52.959 { 00:18:52.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.959 "dma_device_type": 2 00:18:52.959 } 00:18:52.959 ], 00:18:52.959 "driver_specific": { 00:18:52.959 "passthru": { 00:18:52.959 "name": "pt2", 00:18:52.959 "base_bdev_name": "malloc2" 00:18:52.959 } 00:18:52.959 } 00:18:52.959 }' 00:18:52.959 13:47:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:52.959 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.220 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.481 "name": "pt3", 00:18:53.481 "aliases": [ 00:18:53.481 "00000000-0000-0000-0000-000000000003" 00:18:53.481 ], 00:18:53.481 "product_name": "passthru", 00:18:53.481 "block_size": 512, 00:18:53.481 "num_blocks": 65536, 00:18:53.481 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.481 "assigned_rate_limits": { 
00:18:53.481 "rw_ios_per_sec": 0, 00:18:53.481 "rw_mbytes_per_sec": 0, 00:18:53.481 "r_mbytes_per_sec": 0, 00:18:53.481 "w_mbytes_per_sec": 0 00:18:53.481 }, 00:18:53.481 "claimed": true, 00:18:53.481 "claim_type": "exclusive_write", 00:18:53.481 "zoned": false, 00:18:53.481 "supported_io_types": { 00:18:53.481 "read": true, 00:18:53.481 "write": true, 00:18:53.481 "unmap": true, 00:18:53.481 "write_zeroes": true, 00:18:53.481 "flush": true, 00:18:53.481 "reset": true, 00:18:53.481 "compare": false, 00:18:53.481 "compare_and_write": false, 00:18:53.481 "abort": true, 00:18:53.481 "nvme_admin": false, 00:18:53.481 "nvme_io": false 00:18:53.481 }, 00:18:53.481 "memory_domains": [ 00:18:53.481 { 00:18:53.481 "dma_device_id": "system", 00:18:53.481 "dma_device_type": 1 00:18:53.481 }, 00:18:53.481 { 00:18:53.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.481 "dma_device_type": 2 00:18:53.481 } 00:18:53.481 ], 00:18:53.481 "driver_specific": { 00:18:53.481 "passthru": { 00:18:53.481 "name": "pt3", 00:18:53.481 "base_bdev_name": "malloc3" 00:18:53.481 } 00:18:53.481 } 00:18:53.481 }' 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.481 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.741 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.741 13:47:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:53.741 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.001 "name": "pt4", 00:18:54.001 "aliases": [ 00:18:54.001 "00000000-0000-0000-0000-000000000004" 00:18:54.001 ], 00:18:54.001 "product_name": "passthru", 00:18:54.001 "block_size": 512, 00:18:54.001 "num_blocks": 65536, 00:18:54.001 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:54.001 "assigned_rate_limits": { 00:18:54.001 "rw_ios_per_sec": 0, 00:18:54.001 "rw_mbytes_per_sec": 0, 00:18:54.001 "r_mbytes_per_sec": 0, 00:18:54.001 "w_mbytes_per_sec": 0 00:18:54.001 }, 00:18:54.001 "claimed": true, 00:18:54.001 "claim_type": "exclusive_write", 00:18:54.001 "zoned": false, 00:18:54.001 "supported_io_types": { 00:18:54.001 "read": true, 00:18:54.001 "write": true, 00:18:54.001 "unmap": true, 00:18:54.001 "write_zeroes": true, 00:18:54.001 "flush": true, 00:18:54.001 "reset": true, 00:18:54.001 "compare": false, 00:18:54.001 "compare_and_write": false, 00:18:54.001 "abort": true, 00:18:54.001 "nvme_admin": false, 00:18:54.001 "nvme_io": false 00:18:54.001 }, 00:18:54.001 "memory_domains": [ 00:18:54.001 { 00:18:54.001 "dma_device_id": "system", 00:18:54.001 "dma_device_type": 1 00:18:54.001 }, 00:18:54.001 { 00:18:54.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.001 
"dma_device_type": 2 00:18:54.001 } 00:18:54.001 ], 00:18:54.001 "driver_specific": { 00:18:54.001 "passthru": { 00:18:54.001 "name": "pt4", 00:18:54.001 "base_bdev_name": "malloc4" 00:18:54.001 } 00:18:54.001 } 00:18:54.001 }' 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.001 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:54.261 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:54.522 [2024-06-10 13:47:08.880093] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:54.522 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 00:18:54.522 13:47:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 ']' 00:18:54.522 13:47:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:54.783 [2024-06-10 13:47:09.084376] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:54.783 [2024-06-10 13:47:09.084386] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:54.784 [2024-06-10 13:47:09.084424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:54.784 [2024-06-10 13:47:09.084493] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:54.784 [2024-06-10 13:47:09.084500] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26ebb60 name raid_bdev1, state offline 00:18:54.784 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.784 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:55.045 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:55.305 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:55.305 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:55.566 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:55.566 13:47:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:55.827 13:47:10 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:55.827 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:56.088 [2024-06-10 13:47:10.428200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:56.088 [2024-06-10 13:47:10.429343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:56.088 [2024-06-10 13:47:10.429378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:56.088 [2024-06-10 13:47:10.429406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:56.088 [2024-06-10 13:47:10.429442] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:56.088 [2024-06-10 13:47:10.429470] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:56.088 [2024-06-10 13:47:10.429485] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:56.088 [2024-06-10 13:47:10.429499] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:56.088 [2024-06-10 13:47:10.429509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:56.088 [2024-06-10 13:47:10.429515] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f2f10 name raid_bdev1, state configuring 00:18:56.088 request: 00:18:56.088 { 00:18:56.088 "name": "raid_bdev1", 00:18:56.088 "raid_level": "raid1", 00:18:56.088 "base_bdevs": [ 00:18:56.088 "malloc1", 00:18:56.088 "malloc2", 00:18:56.088 "malloc3", 00:18:56.088 "malloc4" 00:18:56.088 ], 00:18:56.088 "superblock": false, 00:18:56.088 "method": "bdev_raid_create", 00:18:56.088 "req_id": 1 00:18:56.088 } 00:18:56.088 Got JSON-RPC error response 00:18:56.088 response: 00:18:56.088 { 00:18:56.088 "code": -17, 00:18:56.088 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:56.088 } 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.088 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 
00:18:56.349 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:56.349 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:56.349 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:56.349 [2024-06-10 13:47:10.821146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:56.349 [2024-06-10 13:47:10.821177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:56.349 [2024-06-10 13:47:10.821189] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f57c0 00:18:56.349 [2024-06-10 13:47:10.821196] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:56.349 [2024-06-10 13:47:10.822543] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:56.349 [2024-06-10 13:47:10.822562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:56.349 [2024-06-10 13:47:10.822607] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:56.349 [2024-06-10 13:47:10.822626] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:56.610 pt1 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.610 13:47:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.610 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.610 "name": "raid_bdev1", 00:18:56.610 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:56.610 "strip_size_kb": 0, 00:18:56.610 "state": "configuring", 00:18:56.610 "raid_level": "raid1", 00:18:56.610 "superblock": true, 00:18:56.610 "num_base_bdevs": 4, 00:18:56.610 "num_base_bdevs_discovered": 1, 00:18:56.610 "num_base_bdevs_operational": 4, 00:18:56.610 "base_bdevs_list": [ 00:18:56.610 { 00:18:56.610 "name": "pt1", 00:18:56.610 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:56.610 "is_configured": true, 00:18:56.610 "data_offset": 2048, 00:18:56.610 "data_size": 63488 00:18:56.610 }, 00:18:56.610 { 00:18:56.610 "name": null, 00:18:56.610 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:56.610 "is_configured": false, 00:18:56.610 "data_offset": 2048, 00:18:56.610 "data_size": 63488 00:18:56.610 }, 00:18:56.610 { 00:18:56.610 "name": null, 00:18:56.610 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:56.610 "is_configured": false, 00:18:56.610 "data_offset": 2048, 00:18:56.610 
"data_size": 63488 00:18:56.610 }, 00:18:56.610 { 00:18:56.610 "name": null, 00:18:56.610 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:56.610 "is_configured": false, 00:18:56.610 "data_offset": 2048, 00:18:56.610 "data_size": 63488 00:18:56.610 } 00:18:56.610 ] 00:18:56.610 }' 00:18:56.610 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.610 13:47:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.180 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:57.180 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:57.440 [2024-06-10 13:47:11.771548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:57.440 [2024-06-10 13:47:11.771573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.440 [2024-06-10 13:47:11.771583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b4c10 00:18:57.440 [2024-06-10 13:47:11.771589] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.440 [2024-06-10 13:47:11.771845] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.440 [2024-06-10 13:47:11.771857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:57.440 [2024-06-10 13:47:11.771896] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:57.440 [2024-06-10 13:47:11.771909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:57.440 pt2 00:18:57.440 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:18:57.701 [2024-06-10 13:47:11.976076] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.701 13:47:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.962 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.962 "name": "raid_bdev1", 00:18:57.962 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:57.962 "strip_size_kb": 0, 00:18:57.962 "state": "configuring", 00:18:57.962 "raid_level": "raid1", 00:18:57.962 "superblock": true, 00:18:57.962 "num_base_bdevs": 4, 00:18:57.962 "num_base_bdevs_discovered": 1, 00:18:57.962 
"num_base_bdevs_operational": 4, 00:18:57.962 "base_bdevs_list": [ 00:18:57.962 { 00:18:57.962 "name": "pt1", 00:18:57.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:57.962 "is_configured": true, 00:18:57.962 "data_offset": 2048, 00:18:57.962 "data_size": 63488 00:18:57.962 }, 00:18:57.962 { 00:18:57.962 "name": null, 00:18:57.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:57.962 "is_configured": false, 00:18:57.962 "data_offset": 2048, 00:18:57.962 "data_size": 63488 00:18:57.962 }, 00:18:57.962 { 00:18:57.962 "name": null, 00:18:57.962 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:57.962 "is_configured": false, 00:18:57.962 "data_offset": 2048, 00:18:57.962 "data_size": 63488 00:18:57.962 }, 00:18:57.962 { 00:18:57.962 "name": null, 00:18:57.962 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:57.962 "is_configured": false, 00:18:57.962 "data_offset": 2048, 00:18:57.962 "data_size": 63488 00:18:57.962 } 00:18:57.962 ] 00:18:57.962 }' 00:18:57.962 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.962 13:47:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:58.533 [2024-06-10 13:47:12.922472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:58.533 [2024-06-10 13:47:12.922503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.533 [2024-06-10 13:47:12.922517] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f2a60 
00:18:58.533 [2024-06-10 13:47:12.922524] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.533 [2024-06-10 13:47:12.922799] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.533 [2024-06-10 13:47:12.922809] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:58.533 [2024-06-10 13:47:12.922851] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:58.533 [2024-06-10 13:47:12.922863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:58.533 pt2 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:58.533 13:47:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:58.793 [2024-06-10 13:47:13.126988] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:58.793 [2024-06-10 13:47:13.127007] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.794 [2024-06-10 13:47:13.127016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ece60 00:18:58.794 [2024-06-10 13:47:13.127022] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.794 [2024-06-10 13:47:13.127260] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.794 [2024-06-10 13:47:13.127272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:58.794 [2024-06-10 13:47:13.127306] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:58.794 [2024-06-10 13:47:13.127317] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:58.794 pt3 00:18:58.794 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:58.794 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:58.794 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:59.054 [2024-06-10 13:47:13.327499] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:59.054 [2024-06-10 13:47:13.327517] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.054 [2024-06-10 13:47:13.327526] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e8ea0 00:18:59.054 [2024-06-10 13:47:13.327532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.054 [2024-06-10 13:47:13.327761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.054 [2024-06-10 13:47:13.327772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:59.054 [2024-06-10 13:47:13.327806] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:59.054 [2024-06-10 13:47:13.327817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:59.054 [2024-06-10 13:47:13.327916] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26e9ac0 00:18:59.054 [2024-06-10 13:47:13.327923] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:59.054 [2024-06-10 13:47:13.328066] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27be920 00:18:59.054 [2024-06-10 13:47:13.328189] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26e9ac0 
00:18:59.054 [2024-06-10 13:47:13.328195] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26e9ac0 00:18:59.054 [2024-06-10 13:47:13.328272] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:59.054 pt4 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.054 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:59.314 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:18:59.314 "name": "raid_bdev1", 00:18:59.314 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:59.314 "strip_size_kb": 0, 00:18:59.314 "state": "online", 00:18:59.314 "raid_level": "raid1", 00:18:59.314 "superblock": true, 00:18:59.314 "num_base_bdevs": 4, 00:18:59.314 "num_base_bdevs_discovered": 4, 00:18:59.314 "num_base_bdevs_operational": 4, 00:18:59.314 "base_bdevs_list": [ 00:18:59.314 { 00:18:59.314 "name": "pt1", 00:18:59.315 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:59.315 "is_configured": true, 00:18:59.315 "data_offset": 2048, 00:18:59.315 "data_size": 63488 00:18:59.315 }, 00:18:59.315 { 00:18:59.315 "name": "pt2", 00:18:59.315 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:59.315 "is_configured": true, 00:18:59.315 "data_offset": 2048, 00:18:59.315 "data_size": 63488 00:18:59.315 }, 00:18:59.315 { 00:18:59.315 "name": "pt3", 00:18:59.315 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:59.315 "is_configured": true, 00:18:59.315 "data_offset": 2048, 00:18:59.315 "data_size": 63488 00:18:59.315 }, 00:18:59.315 { 00:18:59.315 "name": "pt4", 00:18:59.315 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:59.315 "is_configured": true, 00:18:59.315 "data_offset": 2048, 00:18:59.315 "data_size": 63488 00:18:59.315 } 00:18:59.315 ] 00:18:59.315 }' 00:18:59.315 13:47:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.315 13:47:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:59.887 13:47:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:59.887 [2024-06-10 13:47:14.302220] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:59.887 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:59.887 "name": "raid_bdev1", 00:18:59.887 "aliases": [ 00:18:59.887 "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6" 00:18:59.887 ], 00:18:59.887 "product_name": "Raid Volume", 00:18:59.887 "block_size": 512, 00:18:59.887 "num_blocks": 63488, 00:18:59.887 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:59.887 "assigned_rate_limits": { 00:18:59.887 "rw_ios_per_sec": 0, 00:18:59.887 "rw_mbytes_per_sec": 0, 00:18:59.887 "r_mbytes_per_sec": 0, 00:18:59.887 "w_mbytes_per_sec": 0 00:18:59.887 }, 00:18:59.887 "claimed": false, 00:18:59.887 "zoned": false, 00:18:59.887 "supported_io_types": { 00:18:59.887 "read": true, 00:18:59.887 "write": true, 00:18:59.887 "unmap": false, 00:18:59.887 "write_zeroes": true, 00:18:59.887 "flush": false, 00:18:59.887 "reset": true, 00:18:59.887 "compare": false, 00:18:59.887 "compare_and_write": false, 00:18:59.887 "abort": false, 00:18:59.887 "nvme_admin": false, 00:18:59.887 "nvme_io": false 00:18:59.887 }, 00:18:59.887 "memory_domains": [ 00:18:59.887 { 00:18:59.887 "dma_device_id": "system", 00:18:59.887 "dma_device_type": 1 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.887 "dma_device_type": 2 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "system", 00:18:59.887 "dma_device_type": 1 
00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.887 "dma_device_type": 2 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "system", 00:18:59.887 "dma_device_type": 1 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.887 "dma_device_type": 2 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "system", 00:18:59.887 "dma_device_type": 1 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.887 "dma_device_type": 2 00:18:59.887 } 00:18:59.887 ], 00:18:59.887 "driver_specific": { 00:18:59.887 "raid": { 00:18:59.887 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:18:59.887 "strip_size_kb": 0, 00:18:59.887 "state": "online", 00:18:59.887 "raid_level": "raid1", 00:18:59.887 "superblock": true, 00:18:59.887 "num_base_bdevs": 4, 00:18:59.887 "num_base_bdevs_discovered": 4, 00:18:59.887 "num_base_bdevs_operational": 4, 00:18:59.887 "base_bdevs_list": [ 00:18:59.887 { 00:18:59.887 "name": "pt1", 00:18:59.887 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:59.887 "is_configured": true, 00:18:59.887 "data_offset": 2048, 00:18:59.887 "data_size": 63488 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "name": "pt2", 00:18:59.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:59.887 "is_configured": true, 00:18:59.887 "data_offset": 2048, 00:18:59.887 "data_size": 63488 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "name": "pt3", 00:18:59.887 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:59.887 "is_configured": true, 00:18:59.887 "data_offset": 2048, 00:18:59.887 "data_size": 63488 00:18:59.887 }, 00:18:59.887 { 00:18:59.887 "name": "pt4", 00:18:59.887 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:59.887 "is_configured": true, 00:18:59.887 "data_offset": 2048, 00:18:59.887 "data_size": 63488 00:18:59.887 } 00:18:59.887 ] 00:18:59.887 } 00:18:59.887 } 00:18:59.887 }' 00:18:59.887 13:47:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:00.148 pt2 00:19:00.148 pt3 00:19:00.148 pt4' 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:00.148 "name": "pt1", 00:19:00.148 "aliases": [ 00:19:00.148 "00000000-0000-0000-0000-000000000001" 00:19:00.148 ], 00:19:00.148 "product_name": "passthru", 00:19:00.148 "block_size": 512, 00:19:00.148 "num_blocks": 65536, 00:19:00.148 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:00.148 "assigned_rate_limits": { 00:19:00.148 "rw_ios_per_sec": 0, 00:19:00.148 "rw_mbytes_per_sec": 0, 00:19:00.148 "r_mbytes_per_sec": 0, 00:19:00.148 "w_mbytes_per_sec": 0 00:19:00.148 }, 00:19:00.148 "claimed": true, 00:19:00.148 "claim_type": "exclusive_write", 00:19:00.148 "zoned": false, 00:19:00.148 "supported_io_types": { 00:19:00.148 "read": true, 00:19:00.148 "write": true, 00:19:00.148 "unmap": true, 00:19:00.148 "write_zeroes": true, 00:19:00.148 "flush": true, 00:19:00.148 "reset": true, 00:19:00.148 "compare": false, 00:19:00.148 "compare_and_write": false, 00:19:00.148 "abort": true, 00:19:00.148 "nvme_admin": false, 00:19:00.148 "nvme_io": false 00:19:00.148 }, 00:19:00.148 "memory_domains": [ 00:19:00.148 { 00:19:00.148 "dma_device_id": "system", 00:19:00.148 "dma_device_type": 1 00:19:00.148 }, 00:19:00.148 { 00:19:00.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:19:00.148 "dma_device_type": 2 00:19:00.148 } 00:19:00.148 ], 00:19:00.148 "driver_specific": { 00:19:00.148 "passthru": { 00:19:00.148 "name": "pt1", 00:19:00.148 "base_bdev_name": "malloc1" 00:19:00.148 } 00:19:00.148 } 00:19:00.148 }' 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.148 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.408 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.667 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:00.667 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:00.667 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:00.667 13:47:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:00.667 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:00.667 "name": "pt2", 00:19:00.667 "aliases": [ 00:19:00.667 
"00000000-0000-0000-0000-000000000002" 00:19:00.667 ], 00:19:00.667 "product_name": "passthru", 00:19:00.667 "block_size": 512, 00:19:00.667 "num_blocks": 65536, 00:19:00.668 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:00.668 "assigned_rate_limits": { 00:19:00.668 "rw_ios_per_sec": 0, 00:19:00.668 "rw_mbytes_per_sec": 0, 00:19:00.668 "r_mbytes_per_sec": 0, 00:19:00.668 "w_mbytes_per_sec": 0 00:19:00.668 }, 00:19:00.668 "claimed": true, 00:19:00.668 "claim_type": "exclusive_write", 00:19:00.668 "zoned": false, 00:19:00.668 "supported_io_types": { 00:19:00.668 "read": true, 00:19:00.668 "write": true, 00:19:00.668 "unmap": true, 00:19:00.668 "write_zeroes": true, 00:19:00.668 "flush": true, 00:19:00.668 "reset": true, 00:19:00.668 "compare": false, 00:19:00.668 "compare_and_write": false, 00:19:00.668 "abort": true, 00:19:00.668 "nvme_admin": false, 00:19:00.668 "nvme_io": false 00:19:00.668 }, 00:19:00.668 "memory_domains": [ 00:19:00.668 { 00:19:00.668 "dma_device_id": "system", 00:19:00.668 "dma_device_type": 1 00:19:00.668 }, 00:19:00.668 { 00:19:00.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.668 "dma_device_type": 2 00:19:00.668 } 00:19:00.668 ], 00:19:00.668 "driver_specific": { 00:19:00.668 "passthru": { 00:19:00.668 "name": "pt2", 00:19:00.668 "base_bdev_name": "malloc2" 00:19:00.668 } 00:19:00.668 } 00:19:00.668 }' 00:19:00.668 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.668 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:00.927 13:47:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:00.927 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:01.187 "name": "pt3", 00:19:01.187 "aliases": [ 00:19:01.187 "00000000-0000-0000-0000-000000000003" 00:19:01.187 ], 00:19:01.187 "product_name": "passthru", 00:19:01.187 "block_size": 512, 00:19:01.187 "num_blocks": 65536, 00:19:01.187 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:01.187 "assigned_rate_limits": { 00:19:01.187 "rw_ios_per_sec": 0, 00:19:01.187 "rw_mbytes_per_sec": 0, 00:19:01.187 "r_mbytes_per_sec": 0, 00:19:01.187 "w_mbytes_per_sec": 0 00:19:01.187 }, 00:19:01.187 "claimed": true, 00:19:01.187 "claim_type": "exclusive_write", 00:19:01.187 "zoned": false, 00:19:01.187 "supported_io_types": { 00:19:01.187 "read": true, 00:19:01.187 "write": true, 00:19:01.187 "unmap": true, 00:19:01.187 "write_zeroes": true, 00:19:01.187 "flush": true, 00:19:01.187 "reset": true, 00:19:01.187 "compare": false, 00:19:01.187 "compare_and_write": false, 00:19:01.187 "abort": true, 00:19:01.187 
"nvme_admin": false, 00:19:01.187 "nvme_io": false 00:19:01.187 }, 00:19:01.187 "memory_domains": [ 00:19:01.187 { 00:19:01.187 "dma_device_id": "system", 00:19:01.187 "dma_device_type": 1 00:19:01.187 }, 00:19:01.187 { 00:19:01.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.187 "dma_device_type": 2 00:19:01.187 } 00:19:01.187 ], 00:19:01.187 "driver_specific": { 00:19:01.187 "passthru": { 00:19:01.187 "name": "pt3", 00:19:01.187 "base_bdev_name": "malloc3" 00:19:01.187 } 00:19:01.187 } 00:19:01.187 }' 00:19:01.187 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.448 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.708 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:01.708 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.708 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt4 00:19:01.708 13:47:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:01.708 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:01.708 "name": "pt4", 00:19:01.708 "aliases": [ 00:19:01.708 "00000000-0000-0000-0000-000000000004" 00:19:01.708 ], 00:19:01.708 "product_name": "passthru", 00:19:01.708 "block_size": 512, 00:19:01.708 "num_blocks": 65536, 00:19:01.708 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:01.708 "assigned_rate_limits": { 00:19:01.708 "rw_ios_per_sec": 0, 00:19:01.708 "rw_mbytes_per_sec": 0, 00:19:01.708 "r_mbytes_per_sec": 0, 00:19:01.708 "w_mbytes_per_sec": 0 00:19:01.708 }, 00:19:01.708 "claimed": true, 00:19:01.708 "claim_type": "exclusive_write", 00:19:01.708 "zoned": false, 00:19:01.708 "supported_io_types": { 00:19:01.708 "read": true, 00:19:01.708 "write": true, 00:19:01.708 "unmap": true, 00:19:01.708 "write_zeroes": true, 00:19:01.708 "flush": true, 00:19:01.708 "reset": true, 00:19:01.708 "compare": false, 00:19:01.708 "compare_and_write": false, 00:19:01.708 "abort": true, 00:19:01.708 "nvme_admin": false, 00:19:01.708 "nvme_io": false 00:19:01.708 }, 00:19:01.708 "memory_domains": [ 00:19:01.708 { 00:19:01.708 "dma_device_id": "system", 00:19:01.708 "dma_device_type": 1 00:19:01.708 }, 00:19:01.708 { 00:19:01.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.708 "dma_device_type": 2 00:19:01.708 } 00:19:01.708 ], 00:19:01.708 "driver_specific": { 00:19:01.708 "passthru": { 00:19:01.708 "name": "pt4", 00:19:01.708 "base_bdev_name": "malloc4" 00:19:01.708 } 00:19:01.708 } 00:19:01.708 }' 00:19:01.708 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.969 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:02.230 [2024-06-10 13:47:16.656189] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 '!=' 1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 ']' 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:02.230 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:02.490 [2024-06-10 13:47:16.860493] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.490 13:47:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.751 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.751 "name": "raid_bdev1", 00:19:02.751 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:02.751 "strip_size_kb": 0, 00:19:02.751 "state": "online", 00:19:02.751 "raid_level": "raid1", 00:19:02.751 "superblock": true, 00:19:02.751 "num_base_bdevs": 4, 00:19:02.751 "num_base_bdevs_discovered": 3, 00:19:02.751 "num_base_bdevs_operational": 3, 00:19:02.751 "base_bdevs_list": [ 00:19:02.751 { 
00:19:02.751 "name": null, 00:19:02.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.751 "is_configured": false, 00:19:02.751 "data_offset": 2048, 00:19:02.751 "data_size": 63488 00:19:02.751 }, 00:19:02.751 { 00:19:02.751 "name": "pt2", 00:19:02.751 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:02.751 "is_configured": true, 00:19:02.751 "data_offset": 2048, 00:19:02.751 "data_size": 63488 00:19:02.751 }, 00:19:02.751 { 00:19:02.751 "name": "pt3", 00:19:02.751 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:02.751 "is_configured": true, 00:19:02.751 "data_offset": 2048, 00:19:02.751 "data_size": 63488 00:19:02.751 }, 00:19:02.751 { 00:19:02.751 "name": "pt4", 00:19:02.751 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:02.751 "is_configured": true, 00:19:02.751 "data_offset": 2048, 00:19:02.751 "data_size": 63488 00:19:02.751 } 00:19:02.751 ] 00:19:02.751 }' 00:19:02.751 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.751 13:47:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.323 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:03.323 [2024-06-10 13:47:17.774790] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:03.323 [2024-06-10 13:47:17.774805] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:03.323 [2024-06-10 13:47:17.774840] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:03.323 [2024-06-10 13:47:17.774894] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:03.323 [2024-06-10 13:47:17.774900] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26e9ac0 name raid_bdev1, state offline 00:19:03.323 13:47:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:19:03.323 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.584 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:19:03.584 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:19:03.584 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:19:03.584 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:03.584 13:47:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:03.844 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:03.844 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:03.844 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:04.104 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:19:04.105 13:47:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:04.105 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:04.365 [2024-06-10 13:47:18.725151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:04.366 [2024-06-10 13:47:18.725183] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.366 [2024-06-10 13:47:18.725193] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e9d40 00:19:04.366 [2024-06-10 13:47:18.725200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.366 [2024-06-10 13:47:18.726551] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.366 [2024-06-10 13:47:18.726570] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:04.366 [2024-06-10 13:47:18.726617] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:04.366 [2024-06-10 13:47:18.726635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:04.366 pt2 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.366 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:04.627 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.627 "name": "raid_bdev1", 00:19:04.627 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:04.627 "strip_size_kb": 0, 00:19:04.627 "state": "configuring", 00:19:04.627 "raid_level": "raid1", 00:19:04.627 "superblock": true, 00:19:04.627 "num_base_bdevs": 4, 00:19:04.627 "num_base_bdevs_discovered": 1, 00:19:04.627 "num_base_bdevs_operational": 3, 00:19:04.627 "base_bdevs_list": [ 00:19:04.627 { 00:19:04.627 "name": null, 00:19:04.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.627 "is_configured": false, 00:19:04.627 "data_offset": 2048, 00:19:04.627 "data_size": 63488 00:19:04.627 }, 00:19:04.627 { 00:19:04.627 "name": "pt2", 00:19:04.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:04.627 "is_configured": true, 00:19:04.627 "data_offset": 2048, 00:19:04.627 "data_size": 63488 00:19:04.627 }, 00:19:04.627 { 00:19:04.627 "name": null, 00:19:04.627 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:04.627 "is_configured": false, 00:19:04.627 "data_offset": 2048, 00:19:04.627 "data_size": 63488 00:19:04.627 }, 00:19:04.627 { 00:19:04.627 "name": null, 00:19:04.627 "uuid": 
"00000000-0000-0000-0000-000000000004", 00:19:04.627 "is_configured": false, 00:19:04.627 "data_offset": 2048, 00:19:04.627 "data_size": 63488 00:19:04.627 } 00:19:04.627 ] 00:19:04.627 }' 00:19:04.627 13:47:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.627 13:47:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:05.199 [2024-06-10 13:47:19.655515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:05.199 [2024-06-10 13:47:19.655544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:05.199 [2024-06-10 13:47:19.655555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b44c0 00:19:05.199 [2024-06-10 13:47:19.655562] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:05.199 [2024-06-10 13:47:19.655838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:05.199 [2024-06-10 13:47:19.655849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:05.199 [2024-06-10 13:47:19.655892] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:05.199 [2024-06-10 13:47:19.655904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:05.199 pt3 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:05.199 13:47:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.199 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.460 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.460 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.460 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.460 "name": "raid_bdev1", 00:19:05.460 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:05.460 "strip_size_kb": 0, 00:19:05.460 "state": "configuring", 00:19:05.460 "raid_level": "raid1", 00:19:05.460 "superblock": true, 00:19:05.460 "num_base_bdevs": 4, 00:19:05.460 "num_base_bdevs_discovered": 2, 00:19:05.460 "num_base_bdevs_operational": 3, 00:19:05.460 "base_bdevs_list": [ 00:19:05.460 { 00:19:05.460 "name": null, 00:19:05.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.460 "is_configured": false, 00:19:05.460 "data_offset": 2048, 00:19:05.460 "data_size": 63488 00:19:05.460 }, 
00:19:05.460 { 00:19:05.460 "name": "pt2", 00:19:05.460 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:05.460 "is_configured": true, 00:19:05.460 "data_offset": 2048, 00:19:05.460 "data_size": 63488 00:19:05.460 }, 00:19:05.460 { 00:19:05.460 "name": "pt3", 00:19:05.460 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:05.460 "is_configured": true, 00:19:05.460 "data_offset": 2048, 00:19:05.460 "data_size": 63488 00:19:05.460 }, 00:19:05.460 { 00:19:05.460 "name": null, 00:19:05.460 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:05.460 "is_configured": false, 00:19:05.460 "data_offset": 2048, 00:19:05.460 "data_size": 63488 00:19:05.460 } 00:19:05.460 ] 00:19:05.460 }' 00:19:05.460 13:47:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.460 13:47:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.031 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:06.031 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:06.031 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:19:06.031 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:06.292 [2024-06-10 13:47:20.609950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:06.292 [2024-06-10 13:47:20.609989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:06.292 [2024-06-10 13:47:20.610002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ea060 00:19:06.292 [2024-06-10 13:47:20.610009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.292 [2024-06-10 13:47:20.610299] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.292 [2024-06-10 13:47:20.610311] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:06.292 [2024-06-10 13:47:20.610355] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:06.292 [2024-06-10 13:47:20.610368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:06.292 [2024-06-10 13:47:20.610458] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26ec370 00:19:06.292 [2024-06-10 13:47:20.610464] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:06.292 [2024-06-10 13:47:20.610610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ee120 00:19:06.292 [2024-06-10 13:47:20.610715] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26ec370 00:19:06.292 [2024-06-10 13:47:20.610720] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26ec370 00:19:06.292 [2024-06-10 13:47:20.610804] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:06.292 pt4 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.292 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.552 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.552 "name": "raid_bdev1", 00:19:06.552 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:06.552 "strip_size_kb": 0, 00:19:06.552 "state": "online", 00:19:06.552 "raid_level": "raid1", 00:19:06.552 "superblock": true, 00:19:06.552 "num_base_bdevs": 4, 00:19:06.552 "num_base_bdevs_discovered": 3, 00:19:06.552 "num_base_bdevs_operational": 3, 00:19:06.552 "base_bdevs_list": [ 00:19:06.552 { 00:19:06.552 "name": null, 00:19:06.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.552 "is_configured": false, 00:19:06.552 "data_offset": 2048, 00:19:06.552 "data_size": 63488 00:19:06.552 }, 00:19:06.552 { 00:19:06.552 "name": "pt2", 00:19:06.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:06.552 "is_configured": true, 00:19:06.552 "data_offset": 2048, 00:19:06.552 "data_size": 63488 00:19:06.552 }, 00:19:06.552 { 00:19:06.552 "name": "pt3", 00:19:06.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:06.552 "is_configured": true, 00:19:06.552 "data_offset": 2048, 00:19:06.552 "data_size": 63488 00:19:06.552 }, 00:19:06.552 { 00:19:06.552 "name": "pt4", 00:19:06.552 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:06.552 "is_configured": true, 00:19:06.552 "data_offset": 2048, 00:19:06.552 "data_size": 63488 
00:19:06.552 } 00:19:06.552 ] 00:19:06.552 }' 00:19:06.552 13:47:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.552 13:47:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.123 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:07.123 [2024-06-10 13:47:21.584396] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:07.123 [2024-06-10 13:47:21.584411] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:07.123 [2024-06-10 13:47:21.584450] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:07.123 [2024-06-10 13:47:21.584503] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:07.123 [2024-06-10 13:47:21.584509] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26ec370 name raid_bdev1, state offline 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:19:07.384 13:47:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:07.645 13:47:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:07.906 [2024-06-10 13:47:22.189912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:07.906 [2024-06-10 13:47:22.189940] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.906 [2024-06-10 13:47:22.189952] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27b5fc0 00:19:07.906 [2024-06-10 13:47:22.189959] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.906 [2024-06-10 13:47:22.191318] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.906 [2024-06-10 13:47:22.191337] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:07.906 [2024-06-10 13:47:22.191381] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:07.906 [2024-06-10 13:47:22.191400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:07.906 [2024-06-10 13:47:22.191472] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:07.906 [2024-06-10 13:47:22.191480] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:07.906 [2024-06-10 13:47:22.191488] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f1af0 name raid_bdev1, state configuring 00:19:07.906 [2024-06-10 13:47:22.191503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:07.906 [2024-06-10 13:47:22.191560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:07.906 pt1 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 
00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.906 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.167 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.167 "name": "raid_bdev1", 00:19:08.167 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:08.167 "strip_size_kb": 0, 00:19:08.167 "state": "configuring", 00:19:08.167 "raid_level": "raid1", 00:19:08.167 "superblock": true, 00:19:08.167 "num_base_bdevs": 4, 00:19:08.167 "num_base_bdevs_discovered": 2, 00:19:08.167 "num_base_bdevs_operational": 3, 00:19:08.167 "base_bdevs_list": [ 00:19:08.167 { 00:19:08.167 "name": null, 00:19:08.167 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:08.167 "is_configured": false, 00:19:08.167 "data_offset": 2048, 00:19:08.167 "data_size": 63488 00:19:08.167 }, 00:19:08.167 { 00:19:08.167 "name": "pt2", 00:19:08.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:08.167 "is_configured": true, 00:19:08.167 "data_offset": 2048, 00:19:08.167 "data_size": 63488 00:19:08.167 }, 00:19:08.167 { 00:19:08.167 "name": "pt3", 00:19:08.167 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:08.167 "is_configured": true, 00:19:08.167 "data_offset": 2048, 00:19:08.167 "data_size": 63488 00:19:08.167 }, 00:19:08.167 { 00:19:08.167 "name": null, 00:19:08.167 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:08.167 "is_configured": false, 00:19:08.167 "data_offset": 2048, 00:19:08.167 "data_size": 63488 00:19:08.167 } 00:19:08.167 ] 00:19:08.167 }' 00:19:08.167 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.167 13:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.739 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:19:08.739 13:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:08.739 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:19:08.739 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:08.999 [2024-06-10 13:47:23.380953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:08.999 [2024-06-10 13:47:23.380985] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.999 
[2024-06-10 13:47:23.380995] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ea060 00:19:08.999 [2024-06-10 13:47:23.381002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.999 [2024-06-10 13:47:23.381284] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.999 [2024-06-10 13:47:23.381295] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:08.999 [2024-06-10 13:47:23.381338] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:08.999 [2024-06-10 13:47:23.381350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:08.999 [2024-06-10 13:47:23.381439] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26eca40 00:19:08.999 [2024-06-10 13:47:23.381445] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:08.999 [2024-06-10 13:47:23.381590] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26e96d0 00:19:08.999 [2024-06-10 13:47:23.381698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26eca40 00:19:08.999 [2024-06-10 13:47:23.381704] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26eca40 00:19:08.999 [2024-06-10 13:47:23.381779] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.999 pt4 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:08.999 13:47:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.999 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.260 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.260 "name": "raid_bdev1", 00:19:09.260 "uuid": "1f7abc1b-6cbc-4b82-83f0-53c4250a08e6", 00:19:09.260 "strip_size_kb": 0, 00:19:09.260 "state": "online", 00:19:09.260 "raid_level": "raid1", 00:19:09.260 "superblock": true, 00:19:09.260 "num_base_bdevs": 4, 00:19:09.260 "num_base_bdevs_discovered": 3, 00:19:09.260 "num_base_bdevs_operational": 3, 00:19:09.260 "base_bdevs_list": [ 00:19:09.260 { 00:19:09.260 "name": null, 00:19:09.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.260 "is_configured": false, 00:19:09.260 "data_offset": 2048, 00:19:09.260 "data_size": 63488 00:19:09.260 }, 00:19:09.260 { 00:19:09.260 "name": "pt2", 00:19:09.260 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:09.260 "is_configured": true, 00:19:09.260 "data_offset": 2048, 00:19:09.260 "data_size": 63488 00:19:09.260 }, 00:19:09.260 { 00:19:09.260 "name": "pt3", 00:19:09.260 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:09.260 
"is_configured": true, 00:19:09.260 "data_offset": 2048, 00:19:09.260 "data_size": 63488 00:19:09.260 }, 00:19:09.260 { 00:19:09.260 "name": "pt4", 00:19:09.260 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:09.260 "is_configured": true, 00:19:09.260 "data_offset": 2048, 00:19:09.260 "data_size": 63488 00:19:09.260 } 00:19:09.260 ] 00:19:09.260 }' 00:19:09.260 13:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.260 13:47:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.831 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:19:09.831 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:19:10.092 [2024-06-10 13:47:24.528065] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 '!=' 1f7abc1b-6cbc-4b82-83f0-53c4250a08e6 ']' 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1612474 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 1612474 ']' 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 1612474 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # 
uname 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:10.092 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1612474 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1612474' 00:19:10.353 killing process with pid 1612474 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 1612474 00:19:10.353 [2024-06-10 13:47:24.599707] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:10.353 [2024-06-10 13:47:24.599748] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:10.353 [2024-06-10 13:47:24.599804] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:10.353 [2024-06-10 13:47:24.599812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26eca40 name raid_bdev1, state offline 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 1612474 00:19:10.353 [2024-06-10 13:47:24.621249] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:10.353 00:19:10.353 real 0m21.982s 00:19:10.353 user 0m41.024s 00:19:10.353 sys 0m3.212s 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:10.353 13:47:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.353 ************************************ 00:19:10.353 END TEST raid_superblock_test 00:19:10.353 
************************************ 00:19:10.353 13:47:24 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:19:10.353 13:47:24 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:10.353 13:47:24 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:10.353 13:47:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:10.353 ************************************ 00:19:10.353 START TEST raid_read_error_test 00:19:10.353 ************************************ 00:19:10.353 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 read 00:19:10.353 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:10.353 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:10.353 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 
00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.npahggHBsm 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1617228 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1617228 /var/tmp/spdk-raid.sock 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 1617228 ']' 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:10.614 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:10.614 13:47:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.614 [2024-06-10 13:47:24.899274] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:19:10.614 [2024-06-10 13:47:24.899328] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617228 ] 00:19:10.614 [2024-06-10 13:47:24.991233] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:10.614 [2024-06-10 13:47:25.068813] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:19:10.875 [2024-06-10 13:47:25.110806] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:10.875 [2024-06-10 13:47:25.110832] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:11.446 13:47:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:11.446 13:47:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:19:11.446 13:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:11.446 13:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:11.707 BaseBdev1_malloc 00:19:11.707 13:47:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:11.707 true 00:19:11.707 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:11.968 [2024-06-10 13:47:26.346399] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:11.968 [2024-06-10 13:47:26.346433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.968 
[2024-06-10 13:47:26.346445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9bbc90 00:19:11.968 [2024-06-10 13:47:26.346451] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.968 [2024-06-10 13:47:26.347887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.968 [2024-06-10 13:47:26.347907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:11.968 BaseBdev1 00:19:11.968 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:11.968 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:12.229 BaseBdev2_malloc 00:19:12.229 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:12.541 true 00:19:12.541 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:12.541 [2024-06-10 13:47:26.937891] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:12.541 [2024-06-10 13:47:26.937919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:12.541 [2024-06-10 13:47:26.937930] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c0400 00:19:12.541 [2024-06-10 13:47:26.937937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:12.541 [2024-06-10 13:47:26.939182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:12.541 [2024-06-10 13:47:26.939200] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:12.541 BaseBdev2 00:19:12.541 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:12.541 13:47:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:12.829 BaseBdev3_malloc 00:19:12.829 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:13.091 true 00:19:13.091 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:13.091 [2024-06-10 13:47:27.541221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:13.091 [2024-06-10 13:47:27.541244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:13.091 [2024-06-10 13:47:27.541255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c2fc0 00:19:13.091 [2024-06-10 13:47:27.541262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:13.091 [2024-06-10 13:47:27.542480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:13.091 [2024-06-10 13:47:27.542498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:13.091 BaseBdev3 00:19:13.091 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:13.091 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:19:13.351 BaseBdev4_malloc 00:19:13.351 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:13.611 true 00:19:13.611 13:47:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:13.872 [2024-06-10 13:47:28.128587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:13.872 [2024-06-10 13:47:28.128619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:13.872 [2024-06-10 13:47:28.128632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c3710 00:19:13.872 [2024-06-10 13:47:28.128639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:13.872 [2024-06-10 13:47:28.129878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:13.872 [2024-06-10 13:47:28.129897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:13.872 BaseBdev4 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:13.872 [2024-06-10 13:47:28.321093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:13.872 [2024-06-10 13:47:28.322149] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:13.872 [2024-06-10 13:47:28.322209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:13.872 [2024-06-10 13:47:28.322261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:19:13.872 [2024-06-10 13:47:28.322453] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9bd3b0 00:19:13.872 [2024-06-10 13:47:28.322461] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:13.872 [2024-06-10 13:47:28.322610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x811380 00:19:13.872 [2024-06-10 13:47:28.322734] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9bd3b0 00:19:13.872 [2024-06-10 13:47:28.322740] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9bd3b0 00:19:13.872 [2024-06-10 13:47:28.322819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:19:13.872 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.133 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.133 "name": "raid_bdev1", 00:19:14.133 "uuid": "da4bbc9e-a34a-49e0-bb5f-cdacc565ddfe", 00:19:14.133 "strip_size_kb": 0, 00:19:14.133 "state": "online", 00:19:14.133 "raid_level": "raid1", 00:19:14.133 "superblock": true, 00:19:14.133 "num_base_bdevs": 4, 00:19:14.133 "num_base_bdevs_discovered": 4, 00:19:14.133 "num_base_bdevs_operational": 4, 00:19:14.133 "base_bdevs_list": [ 00:19:14.133 { 00:19:14.133 "name": "BaseBdev1", 00:19:14.133 "uuid": "4599316a-d52b-590d-bc4c-da20a60bcf26", 00:19:14.133 "is_configured": true, 00:19:14.133 "data_offset": 2048, 00:19:14.133 "data_size": 63488 00:19:14.133 }, 00:19:14.133 { 00:19:14.133 "name": "BaseBdev2", 00:19:14.133 "uuid": "36768ae7-867f-5d81-a037-7eb50924feea", 00:19:14.133 "is_configured": true, 00:19:14.133 "data_offset": 2048, 00:19:14.133 "data_size": 63488 00:19:14.133 }, 00:19:14.133 { 00:19:14.133 "name": "BaseBdev3", 00:19:14.133 "uuid": "3de5f5bb-3165-55a5-9757-b6dcd5dbca81", 00:19:14.133 "is_configured": true, 00:19:14.133 "data_offset": 2048, 00:19:14.133 "data_size": 63488 00:19:14.133 }, 00:19:14.133 { 00:19:14.133 "name": "BaseBdev4", 00:19:14.133 "uuid": "18b22805-0a86-5803-9fff-9bcbf0ac125e", 00:19:14.133 "is_configured": true, 00:19:14.133 "data_offset": 2048, 00:19:14.133 "data_size": 63488 00:19:14.133 } 00:19:14.133 ] 00:19:14.133 }' 00:19:14.133 13:47:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.133 13:47:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.704 13:47:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:14.704 13:47:29 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:14.964 [2024-06-10 13:47:29.199535] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b1510 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.905 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.165 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.165 "name": "raid_bdev1", 00:19:16.165 "uuid": "da4bbc9e-a34a-49e0-bb5f-cdacc565ddfe", 00:19:16.165 "strip_size_kb": 0, 00:19:16.165 "state": "online", 00:19:16.165 "raid_level": "raid1", 00:19:16.165 "superblock": true, 00:19:16.165 "num_base_bdevs": 4, 00:19:16.165 "num_base_bdevs_discovered": 4, 00:19:16.165 "num_base_bdevs_operational": 4, 00:19:16.165 "base_bdevs_list": [ 00:19:16.165 { 00:19:16.165 "name": "BaseBdev1", 00:19:16.165 "uuid": "4599316a-d52b-590d-bc4c-da20a60bcf26", 00:19:16.165 "is_configured": true, 00:19:16.165 "data_offset": 2048, 00:19:16.165 "data_size": 63488 00:19:16.165 }, 00:19:16.165 { 00:19:16.165 "name": "BaseBdev2", 00:19:16.165 "uuid": "36768ae7-867f-5d81-a037-7eb50924feea", 00:19:16.165 "is_configured": true, 00:19:16.165 "data_offset": 2048, 00:19:16.165 "data_size": 63488 00:19:16.165 }, 00:19:16.165 { 00:19:16.165 "name": "BaseBdev3", 00:19:16.165 "uuid": "3de5f5bb-3165-55a5-9757-b6dcd5dbca81", 00:19:16.165 "is_configured": true, 00:19:16.165 "data_offset": 2048, 00:19:16.165 "data_size": 63488 00:19:16.165 }, 00:19:16.165 { 00:19:16.165 "name": "BaseBdev4", 00:19:16.165 "uuid": "18b22805-0a86-5803-9fff-9bcbf0ac125e", 00:19:16.165 "is_configured": true, 00:19:16.165 "data_offset": 2048, 00:19:16.165 "data_size": 63488 00:19:16.165 } 00:19:16.165 ] 00:19:16.165 }' 00:19:16.165 13:47:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.165 13:47:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 
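The `verify_raid_bdev_state` helper above pulls a record out of `bdev_raid_get_bdevs` with `jq` and compares fields against the expected values. A minimal Python sketch of that comparison (the JSON is an abbreviated copy of the record logged above; the function name mirrors the shell helper but is otherwise an illustration, not the test's actual code):

```python
import json

# Abbreviated copy of the bdev_raid_get_bdevs record captured in the log above.
raid_bdev_info = json.loads("""{
  "name": "raid_bdev1",
  "state": "online",
  "raid_level": "raid1",
  "strip_size_kb": 0,
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 4,
  "num_base_bdevs_operational": 4
}""")

def verify_raid_bdev_state(info, expected_state, raid_level, strip_size, operational):
    # Same field checks the shell helper performs after filtering with jq.
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_discovered"] == operational
    assert info["num_base_bdevs_operational"] == operational

verify_raid_bdev_state(raid_bdev_info, "online", "raid1", 0, 4)
```

With all four base bdevs still discovered after a read-error injection on a raid1 volume, the state stays `online` and the check passes.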
00:19:16.735 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:16.996 [2024-06-10 13:47:31.231729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:16.996 [2024-06-10 13:47:31.231760] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:16.996 [2024-06-10 13:47:31.234651] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:16.996 [2024-06-10 13:47:31.234685] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.996 [2024-06-10 13:47:31.234785] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:16.996 [2024-06-10 13:47:31.234792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9bd3b0 name raid_bdev1, state offline 00:19:16.996 0 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1617228 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 1617228 ']' 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 1617228 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1617228 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with 
pid 1617228' 00:19:16.996 killing process with pid 1617228 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 1617228 00:19:16.996 [2024-06-10 13:47:31.302931] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 1617228 00:19:16.996 [2024-06-10 13:47:31.320450] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.npahggHBsm 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:16.996 00:19:16.996 real 0m6.633s 00:19:16.996 user 0m10.688s 00:19:16.996 sys 0m0.941s 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:16.996 13:47:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.996 ************************************ 00:19:16.996 END TEST raid_read_error_test 00:19:16.996 ************************************ 00:19:17.257 13:47:31 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:19:17.257 13:47:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:17.257 13:47:31 bdev_raid -- 
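The read-error pass above drives a fixed RPC sequence against the bdevperf app. A dry-run sketch of that sequence, using only commands that appear in the log (the `echo` prefix is an assumption added so the sketch prints rather than executes; drop it to run against a live `/var/tmp/spdk-raid.sock` target):

```shell
# Dry run: RPC echoes each command instead of invoking rpc.py.
RPC="echo rpc.py -s /var/tmp/spdk-raid.sock"

for i in 1 2 3 4; do
  # Backing malloc bdev (32 MiB, 512-byte blocks, as in the log)
  $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
  # Error-injection wrapper around the malloc bdev
  $RPC bdev_error_create "BaseBdev${i}_malloc"
  # Passthru bdev exposing the error bdev under the BaseBdevN name
  $RPC bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"
done

# Assemble the raid1 volume with a superblock (-s), then inject a read failure
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
$RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure
$RPC bdev_raid_delete raid_bdev1
```

Because raid1 carries redundancy, the injected read failure on `BaseBdev1` is expected to produce a 0.00 fail-per-second rate in the bdevperf output, which is exactly what the `[[ 0.00 = \0\.\0\0 ]]` check above asserts.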
common/autotest_common.sh@1106 -- # xtrace_disable 00:19:17.257 13:47:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:17.257 ************************************ 00:19:17.257 START TEST raid_write_error_test 00:19:17.257 ************************************ 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 write 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.iXrtKlq7sZ 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1618631 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1618631 /var/tmp/spdk-raid.sock 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 1618631 ']' 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:17.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:17.257 13:47:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.257 [2024-06-10 13:47:31.608123] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:19:17.257 [2024-06-10 13:47:31.608183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618631 ] 00:19:17.257 [2024-06-10 13:47:31.699859] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:17.517 [2024-06-10 13:47:31.775671] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:19:17.517 [2024-06-10 13:47:31.815491] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:17.517 [2024-06-10 13:47:31.815516] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:18.087 13:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:18.087 13:47:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:19:18.087 13:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:18.087 13:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:18.347 BaseBdev1_malloc 00:19:18.347 13:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:18.607 true 00:19:18.607 13:47:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:18.607 [2024-06-10 13:47:33.039049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:18.607 [2024-06-10 13:47:33.039083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.607 [2024-06-10 13:47:33.039095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ccdc90 00:19:18.607 [2024-06-10 13:47:33.039102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.607 [2024-06-10 13:47:33.040568] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.607 [2024-06-10 13:47:33.040588] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:18.607 BaseBdev1 00:19:18.607 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:18.607 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:18.867 BaseBdev2_malloc 00:19:18.867 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:19.128 true 00:19:19.128 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:19.388 [2024-06-10 13:47:33.630491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:19.388 [2024-06-10 13:47:33.630517] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.388 [2024-06-10 13:47:33.630528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd2400 00:19:19.388 [2024-06-10 13:47:33.630534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.388 [2024-06-10 13:47:33.631802] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.388 [2024-06-10 13:47:33.631821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:19.388 BaseBdev2 00:19:19.389 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:19.389 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:19.389 BaseBdev3_malloc 00:19:19.389 13:47:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:19.648 true 00:19:19.648 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:19.909 [2024-06-10 13:47:34.217830] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:19.909 [2024-06-10 13:47:34.217856] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:19.909 [2024-06-10 
13:47:34.217868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd4fc0 00:19:19.909 [2024-06-10 13:47:34.217875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:19.909 [2024-06-10 13:47:34.219128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:19.909 [2024-06-10 13:47:34.219146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:19.909 BaseBdev3 00:19:19.909 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:19.909 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:20.170 BaseBdev4_malloc 00:19:20.170 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:20.170 true 00:19:20.170 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:20.430 [2024-06-10 13:47:34.809171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:20.430 [2024-06-10 13:47:34.809197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:20.430 [2024-06-10 13:47:34.809208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd5710 00:19:20.430 [2024-06-10 13:47:34.809215] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:20.430 [2024-06-10 13:47:34.810476] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:20.430 [2024-06-10 13:47:34.810496] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:20.430 BaseBdev4 00:19:20.430 13:47:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:20.690 [2024-06-10 13:47:35.005687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:20.690 [2024-06-10 13:47:35.006748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:20.690 [2024-06-10 13:47:35.006804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:20.690 [2024-06-10 13:47:35.006856] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:20.690 [2024-06-10 13:47:35.007044] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ccf3b0 00:19:20.690 [2024-06-10 13:47:35.007052] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:20.690 [2024-06-10 13:47:35.007211] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b23380 00:19:20.690 [2024-06-10 13:47:35.007337] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ccf3b0 00:19:20.690 [2024-06-10 13:47:35.007344] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ccf3b0 00:19:20.690 [2024-06-10 13:47:35.007423] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:20.690 13:47:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.690 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:20.951 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.951 "name": "raid_bdev1", 00:19:20.951 "uuid": "17c41ab5-b59e-454b-9767-1b0101c2ed71", 00:19:20.951 "strip_size_kb": 0, 00:19:20.951 "state": "online", 00:19:20.951 "raid_level": "raid1", 00:19:20.951 "superblock": true, 00:19:20.951 "num_base_bdevs": 4, 00:19:20.951 "num_base_bdevs_discovered": 4, 00:19:20.951 "num_base_bdevs_operational": 4, 00:19:20.951 "base_bdevs_list": [ 00:19:20.951 { 00:19:20.951 "name": "BaseBdev1", 00:19:20.951 "uuid": "92d17d87-72f5-5c12-835e-5f548c32415e", 00:19:20.951 "is_configured": true, 00:19:20.951 "data_offset": 2048, 00:19:20.951 "data_size": 63488 00:19:20.951 }, 00:19:20.951 { 00:19:20.951 "name": "BaseBdev2", 00:19:20.951 "uuid": "042fe098-6db8-5dcc-8dd4-850d7abbca86", 00:19:20.951 "is_configured": true, 00:19:20.951 "data_offset": 2048, 00:19:20.951 "data_size": 63488 00:19:20.951 }, 
00:19:20.951 { 00:19:20.951 "name": "BaseBdev3", 00:19:20.951 "uuid": "f05efaa2-f30f-5e79-aaf3-223509589ffd", 00:19:20.951 "is_configured": true, 00:19:20.951 "data_offset": 2048, 00:19:20.951 "data_size": 63488 00:19:20.951 }, 00:19:20.951 { 00:19:20.951 "name": "BaseBdev4", 00:19:20.951 "uuid": "3c6b8ba5-002f-5801-b4ad-239773f72b96", 00:19:20.951 "is_configured": true, 00:19:20.951 "data_offset": 2048, 00:19:20.951 "data_size": 63488 00:19:20.951 } 00:19:20.951 ] 00:19:20.951 }' 00:19:20.951 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.951 13:47:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.522 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:21.522 13:47:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:21.522 [2024-06-10 13:47:35.892146] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc3510 00:19:22.462 13:47:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:22.722 [2024-06-10 13:47:36.987139] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:19:22.722 [2024-06-10 13:47:36.987193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:22.722 [2024-06-10 13:47:36.987396] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cc3510 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:22.722 13:47:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.722 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:22.983 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.983 "name": "raid_bdev1", 00:19:22.983 "uuid": "17c41ab5-b59e-454b-9767-1b0101c2ed71", 00:19:22.983 "strip_size_kb": 0, 00:19:22.983 "state": "online", 00:19:22.983 "raid_level": "raid1", 00:19:22.983 "superblock": true, 00:19:22.983 "num_base_bdevs": 
4, 00:19:22.983 "num_base_bdevs_discovered": 3, 00:19:22.983 "num_base_bdevs_operational": 3, 00:19:22.983 "base_bdevs_list": [ 00:19:22.983 { 00:19:22.983 "name": null, 00:19:22.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.983 "is_configured": false, 00:19:22.983 "data_offset": 2048, 00:19:22.983 "data_size": 63488 00:19:22.983 }, 00:19:22.983 { 00:19:22.983 "name": "BaseBdev2", 00:19:22.983 "uuid": "042fe098-6db8-5dcc-8dd4-850d7abbca86", 00:19:22.983 "is_configured": true, 00:19:22.983 "data_offset": 2048, 00:19:22.983 "data_size": 63488 00:19:22.983 }, 00:19:22.983 { 00:19:22.983 "name": "BaseBdev3", 00:19:22.983 "uuid": "f05efaa2-f30f-5e79-aaf3-223509589ffd", 00:19:22.983 "is_configured": true, 00:19:22.983 "data_offset": 2048, 00:19:22.983 "data_size": 63488 00:19:22.983 }, 00:19:22.983 { 00:19:22.983 "name": "BaseBdev4", 00:19:22.983 "uuid": "3c6b8ba5-002f-5801-b4ad-239773f72b96", 00:19:22.983 "is_configured": true, 00:19:22.983 "data_offset": 2048, 00:19:22.983 "data_size": 63488 00:19:22.983 } 00:19:22.983 ] 00:19:22.983 }' 00:19:22.983 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.983 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:23.554 [2024-06-10 13:47:37.962319] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:23.554 [2024-06-10 13:47:37.962349] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:23.554 [2024-06-10 13:47:37.965106] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:23.554 [2024-06-10 13:47:37.965134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:23.554 [2024-06-10 13:47:37.965219] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:23.554 [2024-06-10 13:47:37.965231] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ccf3b0 name raid_bdev1, state offline 00:19:23.554 0 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1618631 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 1618631 ']' 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 1618631 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:23.554 13:47:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1618631 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1618631' 00:19:23.815 killing process with pid 1618631 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 1618631 00:19:23.815 [2024-06-10 13:47:38.031273] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 1618631 00:19:23.815 [2024-06-10 13:47:38.048567] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.iXrtKlq7sZ 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:23.815 13:47:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:23.815 00:19:23.815 real 0m6.650s 00:19:23.815 user 0m10.740s 00:19:23.815 sys 0m0.953s 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:23.815 13:47:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:23.815 ************************************ 00:19:23.815 END TEST raid_write_error_test 00:19:23.815 ************************************ 00:19:23.815 13:47:38 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:19:23.815 13:47:38 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:19:23.815 13:47:38 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:19:23.815 13:47:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:19:23.815 13:47:38 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:23.815 13:47:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:23.815 ************************************ 00:19:23.815 START TEST raid_rebuild_test 00:19:23.815 ************************************ 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false false true 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:23.815 13:47:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1620016 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1620016 /var/tmp/spdk-raid.sock 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 1620016 ']' 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:23.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:23.815 13:47:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:24.076 [2024-06-10 13:47:38.328122] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:19:24.076 [2024-06-10 13:47:38.328191] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1620016 ] 00:19:24.076 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:24.076 Zero copy mechanism will not be used. 
00:19:24.076 [2024-06-10 13:47:38.420793] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:24.076 [2024-06-10 13:47:38.497996] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:19:24.076 [2024-06-10 13:47:38.545966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:24.076 [2024-06-10 13:47:38.545992] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:25.018 13:47:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:25.018 13:47:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:19:25.018 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:25.018 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:25.018 BaseBdev1_malloc 00:19:25.018 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:25.278 [2024-06-10 13:47:39.569481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:25.278 [2024-06-10 13:47:39.569521] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.278 [2024-06-10 13:47:39.569535] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e9900 00:19:25.278 [2024-06-10 13:47:39.569542] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.278 [2024-06-10 13:47:39.570915] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.278 [2024-06-10 13:47:39.570936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:25.278 BaseBdev1 00:19:25.278 13:47:39 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:25.278 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:25.538 BaseBdev2_malloc 00:19:25.538 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:25.538 [2024-06-10 13:47:39.972889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:25.539 [2024-06-10 13:47:39.972917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.539 [2024-06-10 13:47:39.972929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7ea9c0 00:19:25.539 [2024-06-10 13:47:39.972936] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.539 [2024-06-10 13:47:39.974167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.539 [2024-06-10 13:47:39.974185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:25.539 BaseBdev2 00:19:25.539 13:47:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:25.799 spare_malloc 00:19:25.799 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:26.059 spare_delay 00:19:26.059 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:19:26.320 [2024-06-10 13:47:40.572391] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:26.320 [2024-06-10 13:47:40.572420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.320 [2024-06-10 13:47:40.572430] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9986b0 00:19:26.320 [2024-06-10 13:47:40.572437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.320 [2024-06-10 13:47:40.573677] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.320 [2024-06-10 13:47:40.573695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:26.320 spare 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:26.320 [2024-06-10 13:47:40.764887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:26.320 [2024-06-10 13:47:40.765917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.320 [2024-06-10 13:47:40.765974] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x999d20 00:19:26.320 [2024-06-10 13:47:40.765980] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:26.320 [2024-06-10 13:47:40.766138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e95d0 00:19:26.320 [2024-06-10 13:47:40.766257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x999d20 00:19:26.320 [2024-06-10 13:47:40.766264] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x999d20 00:19:26.320 [2024-06-10 13:47:40.766349] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.320 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.581 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.581 "name": "raid_bdev1", 00:19:26.581 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:26.581 "strip_size_kb": 0, 00:19:26.581 "state": "online", 00:19:26.581 "raid_level": "raid1", 00:19:26.581 "superblock": false, 00:19:26.581 "num_base_bdevs": 2, 00:19:26.581 "num_base_bdevs_discovered": 2, 00:19:26.581 "num_base_bdevs_operational": 2, 00:19:26.581 "base_bdevs_list": [ 00:19:26.581 { 00:19:26.581 "name": "BaseBdev1", 00:19:26.581 "uuid": 
"331accf8-3871-5cb1-9004-828028bea32f", 00:19:26.581 "is_configured": true, 00:19:26.581 "data_offset": 0, 00:19:26.581 "data_size": 65536 00:19:26.581 }, 00:19:26.581 { 00:19:26.581 "name": "BaseBdev2", 00:19:26.581 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:26.581 "is_configured": true, 00:19:26.581 "data_offset": 0, 00:19:26.581 "data_size": 65536 00:19:26.581 } 00:19:26.581 ] 00:19:26.581 }' 00:19:26.581 13:47:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.581 13:47:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.151 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:27.152 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:27.412 [2024-06-10 13:47:41.735535] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:27.412 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:27.412 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.412 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:27.673 13:47:41 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:27.673 13:47:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:27.673 [2024-06-10 13:47:42.144418] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e95d0 00:19:27.933 /dev/nbd0 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:27.933 1+0 records in 00:19:27.933 1+0 records out 00:19:27.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318802 s, 12.8 MB/s 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:27.933 13:47:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:32.134 65536+0 records in 00:19:32.134 65536+0 records out 00:19:32.134 33554432 bytes (34 MB, 32 MiB) copied, 4.1933 s, 8.0 MB/s 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:32.134 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:32.395 [2024-06-10 13:47:46.620693] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:32.395 [2024-06-10 13:47:46.813223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:32.395 13:47:46 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.395 13:47:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.656 13:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.656 "name": "raid_bdev1", 00:19:32.656 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:32.656 "strip_size_kb": 0, 00:19:32.656 "state": "online", 00:19:32.656 "raid_level": "raid1", 00:19:32.656 "superblock": false, 00:19:32.656 "num_base_bdevs": 2, 00:19:32.656 "num_base_bdevs_discovered": 1, 00:19:32.656 "num_base_bdevs_operational": 1, 00:19:32.656 "base_bdevs_list": [ 00:19:32.656 { 00:19:32.656 "name": null, 00:19:32.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.656 "is_configured": false, 00:19:32.656 "data_offset": 0, 00:19:32.656 "data_size": 65536 00:19:32.656 }, 00:19:32.656 { 00:19:32.656 "name": "BaseBdev2", 
00:19:32.656 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:32.656 "is_configured": true, 00:19:32.656 "data_offset": 0, 00:19:32.656 "data_size": 65536 00:19:32.656 } 00:19:32.656 ] 00:19:32.656 }' 00:19:32.656 13:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.656 13:47:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.227 13:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:33.488 [2024-06-10 13:47:47.751604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:33.488 [2024-06-10 13:47:47.755024] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x997ec0 00:19:33.488 [2024-06-10 13:47:47.756679] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:33.488 13:47:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.429 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.689 13:47:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:34.689 "name": "raid_bdev1", 00:19:34.689 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:34.689 "strip_size_kb": 0, 00:19:34.689 "state": "online", 00:19:34.689 "raid_level": "raid1", 00:19:34.689 "superblock": false, 00:19:34.689 "num_base_bdevs": 2, 00:19:34.689 "num_base_bdevs_discovered": 2, 00:19:34.689 "num_base_bdevs_operational": 2, 00:19:34.689 "process": { 00:19:34.689 "type": "rebuild", 00:19:34.689 "target": "spare", 00:19:34.689 "progress": { 00:19:34.689 "blocks": 24576, 00:19:34.689 "percent": 37 00:19:34.689 } 00:19:34.689 }, 00:19:34.689 "base_bdevs_list": [ 00:19:34.689 { 00:19:34.689 "name": "spare", 00:19:34.689 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:34.689 "is_configured": true, 00:19:34.689 "data_offset": 0, 00:19:34.689 "data_size": 65536 00:19:34.689 }, 00:19:34.689 { 00:19:34.689 "name": "BaseBdev2", 00:19:34.689 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:34.689 "is_configured": true, 00:19:34.689 "data_offset": 0, 00:19:34.689 "data_size": 65536 00:19:34.689 } 00:19:34.689 ] 00:19:34.689 }' 00:19:34.689 13:47:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:34.689 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:34.689 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:34.689 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:34.689 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:34.949 [2024-06-10 13:47:49.261097] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:34.949 [2024-06-10 13:47:49.265893] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:19:34.949 [2024-06-10 13:47:49.265925] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:34.949 [2024-06-10 13:47:49.265936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:34.949 [2024-06-10 13:47:49.265940] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.949 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.209 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.209 "name": "raid_bdev1", 00:19:35.209 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:35.209 
"strip_size_kb": 0, 00:19:35.209 "state": "online", 00:19:35.209 "raid_level": "raid1", 00:19:35.209 "superblock": false, 00:19:35.209 "num_base_bdevs": 2, 00:19:35.209 "num_base_bdevs_discovered": 1, 00:19:35.209 "num_base_bdevs_operational": 1, 00:19:35.209 "base_bdevs_list": [ 00:19:35.209 { 00:19:35.209 "name": null, 00:19:35.209 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.209 "is_configured": false, 00:19:35.209 "data_offset": 0, 00:19:35.209 "data_size": 65536 00:19:35.209 }, 00:19:35.209 { 00:19:35.209 "name": "BaseBdev2", 00:19:35.209 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:35.209 "is_configured": true, 00:19:35.209 "data_offset": 0, 00:19:35.209 "data_size": 65536 00:19:35.209 } 00:19:35.209 ] 00:19:35.209 }' 00:19:35.209 13:47:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.209 13:47:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.781 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:36.041 "name": "raid_bdev1", 00:19:36.041 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 
00:19:36.041 "strip_size_kb": 0, 00:19:36.041 "state": "online", 00:19:36.041 "raid_level": "raid1", 00:19:36.041 "superblock": false, 00:19:36.041 "num_base_bdevs": 2, 00:19:36.041 "num_base_bdevs_discovered": 1, 00:19:36.041 "num_base_bdevs_operational": 1, 00:19:36.041 "base_bdevs_list": [ 00:19:36.041 { 00:19:36.041 "name": null, 00:19:36.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.041 "is_configured": false, 00:19:36.041 "data_offset": 0, 00:19:36.041 "data_size": 65536 00:19:36.041 }, 00:19:36.041 { 00:19:36.041 "name": "BaseBdev2", 00:19:36.041 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:36.041 "is_configured": true, 00:19:36.041 "data_offset": 0, 00:19:36.041 "data_size": 65536 00:19:36.041 } 00:19:36.041 ] 00:19:36.041 }' 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:36.041 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:36.302 [2024-06-10 13:47:50.544412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:36.302 [2024-06-10 13:47:50.547845] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9946d0 00:19:36.302 [2024-06-10 13:47:50.549056] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:36.302 13:47:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.243 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:37.510 "name": "raid_bdev1", 00:19:37.510 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:37.510 "strip_size_kb": 0, 00:19:37.510 "state": "online", 00:19:37.510 "raid_level": "raid1", 00:19:37.510 "superblock": false, 00:19:37.510 "num_base_bdevs": 2, 00:19:37.510 "num_base_bdevs_discovered": 2, 00:19:37.510 "num_base_bdevs_operational": 2, 00:19:37.510 "process": { 00:19:37.510 "type": "rebuild", 00:19:37.510 "target": "spare", 00:19:37.510 "progress": { 00:19:37.510 "blocks": 22528, 00:19:37.510 "percent": 34 00:19:37.510 } 00:19:37.510 }, 00:19:37.510 "base_bdevs_list": [ 00:19:37.510 { 00:19:37.510 "name": "spare", 00:19:37.510 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:37.510 "is_configured": true, 00:19:37.510 "data_offset": 0, 00:19:37.510 "data_size": 65536 00:19:37.510 }, 00:19:37.510 { 00:19:37.510 "name": "BaseBdev2", 00:19:37.510 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:37.510 "is_configured": true, 00:19:37.510 "data_offset": 0, 00:19:37.510 "data_size": 65536 00:19:37.510 } 00:19:37.510 ] 00:19:37.510 }' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=659 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:37.510 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:37.511 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:37.511 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.511 13:47:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:37.771 "name": "raid_bdev1", 00:19:37.771 
"uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:37.771 "strip_size_kb": 0, 00:19:37.771 "state": "online", 00:19:37.771 "raid_level": "raid1", 00:19:37.771 "superblock": false, 00:19:37.771 "num_base_bdevs": 2, 00:19:37.771 "num_base_bdevs_discovered": 2, 00:19:37.771 "num_base_bdevs_operational": 2, 00:19:37.771 "process": { 00:19:37.771 "type": "rebuild", 00:19:37.771 "target": "spare", 00:19:37.771 "progress": { 00:19:37.771 "blocks": 28672, 00:19:37.771 "percent": 43 00:19:37.771 } 00:19:37.771 }, 00:19:37.771 "base_bdevs_list": [ 00:19:37.771 { 00:19:37.771 "name": "spare", 00:19:37.771 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:37.771 "is_configured": true, 00:19:37.771 "data_offset": 0, 00:19:37.771 "data_size": 65536 00:19:37.771 }, 00:19:37.771 { 00:19:37.771 "name": "BaseBdev2", 00:19:37.771 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:37.771 "is_configured": true, 00:19:37.771 "data_offset": 0, 00:19:37.771 "data_size": 65536 00:19:37.771 } 00:19:37.771 ] 00:19:37.771 }' 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:37.771 13:47:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.711 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.971 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:38.971 "name": "raid_bdev1", 00:19:38.971 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:38.971 "strip_size_kb": 0, 00:19:38.971 "state": "online", 00:19:38.971 "raid_level": "raid1", 00:19:38.971 "superblock": false, 00:19:38.971 "num_base_bdevs": 2, 00:19:38.971 "num_base_bdevs_discovered": 2, 00:19:38.971 "num_base_bdevs_operational": 2, 00:19:38.971 "process": { 00:19:38.971 "type": "rebuild", 00:19:38.971 "target": "spare", 00:19:38.971 "progress": { 00:19:38.971 "blocks": 55296, 00:19:38.971 "percent": 84 00:19:38.971 } 00:19:38.971 }, 00:19:38.971 "base_bdevs_list": [ 00:19:38.971 { 00:19:38.971 "name": "spare", 00:19:38.971 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:38.971 "is_configured": true, 00:19:38.971 "data_offset": 0, 00:19:38.971 "data_size": 65536 00:19:38.971 }, 00:19:38.971 { 00:19:38.971 "name": "BaseBdev2", 00:19:38.971 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:38.971 "is_configured": true, 00:19:38.971 "data_offset": 0, 00:19:38.971 "data_size": 65536 00:19:38.971 } 00:19:38.971 ] 00:19:38.971 }' 00:19:38.971 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:38.971 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:38.971 13:47:53 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:39.231 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:39.231 13:47:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:39.491 [2024-06-10 13:47:53.768100] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:39.491 [2024-06-10 13:47:53.768146] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:39.491 [2024-06-10 13:47:53.768177] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.063 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:40.325 "name": "raid_bdev1", 00:19:40.325 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:40.325 "strip_size_kb": 0, 00:19:40.325 "state": "online", 00:19:40.325 "raid_level": "raid1", 00:19:40.325 "superblock": false, 00:19:40.325 "num_base_bdevs": 2, 00:19:40.325 
"num_base_bdevs_discovered": 2, 00:19:40.325 "num_base_bdevs_operational": 2, 00:19:40.325 "base_bdevs_list": [ 00:19:40.325 { 00:19:40.325 "name": "spare", 00:19:40.325 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:40.325 "is_configured": true, 00:19:40.325 "data_offset": 0, 00:19:40.325 "data_size": 65536 00:19:40.325 }, 00:19:40.325 { 00:19:40.325 "name": "BaseBdev2", 00:19:40.325 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:40.325 "is_configured": true, 00:19:40.325 "data_offset": 0, 00:19:40.325 "data_size": 65536 00:19:40.325 } 00:19:40.325 ] 00:19:40.325 }' 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.325 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.586 13:47:54 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:40.586 "name": "raid_bdev1", 00:19:40.586 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:40.586 "strip_size_kb": 0, 00:19:40.586 "state": "online", 00:19:40.586 "raid_level": "raid1", 00:19:40.586 "superblock": false, 00:19:40.586 "num_base_bdevs": 2, 00:19:40.586 "num_base_bdevs_discovered": 2, 00:19:40.586 "num_base_bdevs_operational": 2, 00:19:40.586 "base_bdevs_list": [ 00:19:40.586 { 00:19:40.586 "name": "spare", 00:19:40.586 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:40.586 "is_configured": true, 00:19:40.586 "data_offset": 0, 00:19:40.586 "data_size": 65536 00:19:40.586 }, 00:19:40.586 { 00:19:40.586 "name": "BaseBdev2", 00:19:40.586 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:40.586 "is_configured": true, 00:19:40.586 "data_offset": 0, 00:19:40.586 "data_size": 65536 00:19:40.586 } 00:19:40.586 ] 00:19:40.586 }' 00:19:40.586 13:47:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.586 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.847 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.847 "name": "raid_bdev1", 00:19:40.847 "uuid": "1d13e5db-3b36-4a76-8aa8-0cddc3d4863e", 00:19:40.847 "strip_size_kb": 0, 00:19:40.847 "state": "online", 00:19:40.847 "raid_level": "raid1", 00:19:40.847 "superblock": false, 00:19:40.847 "num_base_bdevs": 2, 00:19:40.847 "num_base_bdevs_discovered": 2, 00:19:40.847 "num_base_bdevs_operational": 2, 00:19:40.847 "base_bdevs_list": [ 00:19:40.847 { 00:19:40.847 "name": "spare", 00:19:40.847 "uuid": "b5399926-6980-50f1-8673-6aaf6399c1b9", 00:19:40.847 "is_configured": true, 00:19:40.847 "data_offset": 0, 00:19:40.847 "data_size": 65536 00:19:40.847 }, 00:19:40.847 { 00:19:40.847 "name": "BaseBdev2", 00:19:40.847 "uuid": "779dc9d5-24bd-5590-9cba-bf5cc66e65df", 00:19:40.847 "is_configured": true, 00:19:40.847 "data_offset": 0, 00:19:40.847 "data_size": 65536 00:19:40.847 } 00:19:40.847 ] 00:19:40.847 }' 00:19:40.847 13:47:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.847 13:47:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.419 13:47:55 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:41.680 [2024-06-10 13:47:55.996750] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:41.680 [2024-06-10 13:47:55.996769] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:41.680 [2024-06-10 13:47:55.996821] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:41.680 [2024-06-10 13:47:55.996868] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:41.680 [2024-06-10 13:47:55.996875] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x999d20 name raid_bdev1, state offline 00:19:41.680 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.680 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:41.941 /dev/nbd0 00:19:41.941 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.201 1+0 records in 00:19:42.201 1+0 records out 00:19:42.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245258 s, 16.7 MB/s 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:42.201 /dev/nbd1 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 
00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:42.201 1+0 records in 00:19:42.201 1+0 records out 00:19:42.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240168 s, 17.1 MB/s 00:19:42.201 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:42.461 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:19:42.461 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:42.461 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:42.461 13:47:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:19:42.461 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:42.462 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:42.721 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:42.721 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:42.722 13:47:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:42.722 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:42.981 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:42.981 13:47:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:42.982 13:47:57 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1620016 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 1620016 ']' 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 1620016 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1620016 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1620016' 00:19:42.982 killing process with pid 1620016 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 1620016 00:19:42.982 Received shutdown signal, test time was about 60.000000 seconds 00:19:42.982 00:19:42.982 Latency(us) 00:19:42.982 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:42.982 =================================================================================================================== 00:19:42.982 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:42.982 [2024-06-10 13:47:57.253282] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 1620016 00:19:42.982 [2024-06-10 13:47:57.268393] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:19:42.982 00:19:42.982 real 0m19.129s 00:19:42.982 user 0m26.783s 00:19:42.982 sys 0m3.214s 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:42.982 13:47:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.982 ************************************ 00:19:42.982 END TEST raid_rebuild_test 00:19:42.982 ************************************ 00:19:42.982 13:47:57 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:19:42.982 13:47:57 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:19:42.982 13:47:57 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:42.982 13:47:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:43.243 ************************************ 00:19:43.243 START TEST raid_rebuild_test_sb 00:19:43.243 ************************************ 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:43.243 13:47:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1623993 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1623993 /var/tmp/spdk-raid.sock 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1623993 ']' 00:19:43.243 13:47:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:43.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:43.243 13:47:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:43.243 [2024-06-10 13:47:57.531700] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:19:43.243 [2024-06-10 13:47:57.531746] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623993 ] 00:19:43.243 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:43.243 Zero copy mechanism will not be used. 
00:19:43.243 [2024-06-10 13:47:57.617986] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:43.243 [2024-06-10 13:47:57.683236] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:19:43.505 [2024-06-10 13:47:57.727544] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.505 [2024-06-10 13:47:57.727569] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:44.109 13:47:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:44.109 13:47:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:19:44.109 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:44.109 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:44.109 BaseBdev1_malloc 00:19:44.425 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:44.425 [2024-06-10 13:47:58.770616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:44.425 [2024-06-10 13:47:58.770653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.425 [2024-06-10 13:47:58.770668] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2705900 00:19:44.425 [2024-06-10 13:47:58.770675] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.425 [2024-06-10 13:47:58.772126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.425 [2024-06-10 13:47:58.772148] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:44.425 BaseBdev1 
00:19:44.425 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:44.425 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:44.685 BaseBdev2_malloc 00:19:44.685 13:47:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:44.945 [2024-06-10 13:47:59.177871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:44.945 [2024-06-10 13:47:59.177900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:44.945 [2024-06-10 13:47:59.177913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27069c0 00:19:44.945 [2024-06-10 13:47:59.177920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:44.945 [2024-06-10 13:47:59.179200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:44.945 [2024-06-10 13:47:59.179218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:44.945 BaseBdev2 00:19:44.945 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:44.945 spare_malloc 00:19:44.945 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:45.205 spare_delay 00:19:45.205 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:45.465 [2024-06-10 13:47:59.769299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:45.465 [2024-06-10 13:47:59.769323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.465 [2024-06-10 13:47:59.769333] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b46b0 00:19:45.465 [2024-06-10 13:47:59.769339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.465 [2024-06-10 13:47:59.770784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.465 [2024-06-10 13:47:59.770804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:45.465 spare 00:19:45.465 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:45.724 [2024-06-10 13:47:59.961810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:45.724 [2024-06-10 13:47:59.962844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:45.724 [2024-06-10 13:47:59.962966] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28b5d20 00:19:45.724 [2024-06-10 13:47:59.962974] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:45.724 [2024-06-10 13:47:59.963119] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27055d0 00:19:45.724 [2024-06-10 13:47:59.963237] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28b5d20 00:19:45.724 [2024-06-10 13:47:59.963243] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x28b5d20 00:19:45.724 [2024-06-10 13:47:59.963314] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.724 13:47:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.724 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.724 "name": "raid_bdev1", 00:19:45.724 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:45.724 "strip_size_kb": 0, 00:19:45.724 "state": "online", 00:19:45.724 "raid_level": "raid1", 00:19:45.724 "superblock": true, 00:19:45.724 "num_base_bdevs": 2, 00:19:45.724 "num_base_bdevs_discovered": 2, 00:19:45.724 
"num_base_bdevs_operational": 2, 00:19:45.724 "base_bdevs_list": [ 00:19:45.724 { 00:19:45.724 "name": "BaseBdev1", 00:19:45.724 "uuid": "89adbac2-53bc-55c5-88de-abccad5d849f", 00:19:45.724 "is_configured": true, 00:19:45.724 "data_offset": 2048, 00:19:45.724 "data_size": 63488 00:19:45.724 }, 00:19:45.724 { 00:19:45.724 "name": "BaseBdev2", 00:19:45.724 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:45.724 "is_configured": true, 00:19:45.724 "data_offset": 2048, 00:19:45.724 "data_size": 63488 00:19:45.724 } 00:19:45.724 ] 00:19:45.724 }' 00:19:45.724 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.724 13:48:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:46.293 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:46.293 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:46.553 [2024-06-10 13:48:00.872598] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:46.553 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:46.553 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:46.553 13:48:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:46.812 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:46.812 [2024-06-10 13:48:01.265429] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27055d0 00:19:46.812 /dev/nbd0 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:47.072 1+0 records in 00:19:47.072 1+0 records out 00:19:47.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324955 s, 12.6 MB/s 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:47.072 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:47.073 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:47.073 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:47.073 13:48:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:19:51.275 63488+0 records in 00:19:51.275 63488+0 records out 00:19:51.275 32505856 bytes (33 MB, 
31 MiB) copied, 3.89254 s, 8.4 MB/s 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:51.275 [2024-06-10 13:48:05.447331] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:19:51.275 [2024-06-10 13:48:05.635844] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.275 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:51.536 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.536 "name": "raid_bdev1", 00:19:51.536 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:51.536 "strip_size_kb": 0, 00:19:51.536 "state": "online", 00:19:51.536 "raid_level": "raid1", 00:19:51.536 "superblock": true, 00:19:51.536 "num_base_bdevs": 2, 00:19:51.536 "num_base_bdevs_discovered": 1, 00:19:51.536 
"num_base_bdevs_operational": 1, 00:19:51.536 "base_bdevs_list": [ 00:19:51.536 { 00:19:51.536 "name": null, 00:19:51.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.536 "is_configured": false, 00:19:51.536 "data_offset": 2048, 00:19:51.536 "data_size": 63488 00:19:51.536 }, 00:19:51.536 { 00:19:51.536 "name": "BaseBdev2", 00:19:51.536 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:51.536 "is_configured": true, 00:19:51.536 "data_offset": 2048, 00:19:51.536 "data_size": 63488 00:19:51.536 } 00:19:51.536 ] 00:19:51.536 }' 00:19:51.536 13:48:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.536 13:48:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.107 13:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:52.107 [2024-06-10 13:48:06.566204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:52.107 [2024-06-10 13:48:06.569593] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b3ec0 00:19:52.107 [2024-06-10 13:48:06.571239] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:52.367 13:48:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.308 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.568 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:53.568 "name": "raid_bdev1", 00:19:53.568 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:53.568 "strip_size_kb": 0, 00:19:53.568 "state": "online", 00:19:53.568 "raid_level": "raid1", 00:19:53.568 "superblock": true, 00:19:53.568 "num_base_bdevs": 2, 00:19:53.568 "num_base_bdevs_discovered": 2, 00:19:53.568 "num_base_bdevs_operational": 2, 00:19:53.568 "process": { 00:19:53.568 "type": "rebuild", 00:19:53.568 "target": "spare", 00:19:53.568 "progress": { 00:19:53.568 "blocks": 24576, 00:19:53.568 "percent": 38 00:19:53.568 } 00:19:53.568 }, 00:19:53.568 "base_bdevs_list": [ 00:19:53.568 { 00:19:53.568 "name": "spare", 00:19:53.568 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:53.568 "is_configured": true, 00:19:53.568 "data_offset": 2048, 00:19:53.568 "data_size": 63488 00:19:53.568 }, 00:19:53.568 { 00:19:53.568 "name": "BaseBdev2", 00:19:53.568 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:53.568 "is_configured": true, 00:19:53.568 "data_offset": 2048, 00:19:53.568 "data_size": 63488 00:19:53.568 } 00:19:53.568 ] 00:19:53.568 }' 00:19:53.568 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:53.568 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:53.568 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:53.568 13:48:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:53.568 13:48:07 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:53.829 [2024-06-10 13:48:08.075878] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:53.829 [2024-06-10 13:48:08.080180] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:53.829 [2024-06-10 13:48:08.080213] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.829 [2024-06-10 13:48:08.080223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:53.829 [2024-06-10 13:48:08.080228] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.829 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.089 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.089 "name": "raid_bdev1", 00:19:54.089 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:54.089 "strip_size_kb": 0, 00:19:54.089 "state": "online", 00:19:54.089 "raid_level": "raid1", 00:19:54.089 "superblock": true, 00:19:54.089 "num_base_bdevs": 2, 00:19:54.089 "num_base_bdevs_discovered": 1, 00:19:54.089 "num_base_bdevs_operational": 1, 00:19:54.089 "base_bdevs_list": [ 00:19:54.089 { 00:19:54.089 "name": null, 00:19:54.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.089 "is_configured": false, 00:19:54.089 "data_offset": 2048, 00:19:54.089 "data_size": 63488 00:19:54.089 }, 00:19:54.089 { 00:19:54.089 "name": "BaseBdev2", 00:19:54.089 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:54.089 "is_configured": true, 00:19:54.089 "data_offset": 2048, 00:19:54.089 "data_size": 63488 00:19:54.089 } 00:19:54.089 ] 00:19:54.089 }' 00:19:54.089 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.089 13:48:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:54.662 13:48:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.662 13:48:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.662 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:54.662 "name": "raid_bdev1", 00:19:54.662 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:54.662 "strip_size_kb": 0, 00:19:54.662 "state": "online", 00:19:54.662 "raid_level": "raid1", 00:19:54.662 "superblock": true, 00:19:54.662 "num_base_bdevs": 2, 00:19:54.662 "num_base_bdevs_discovered": 1, 00:19:54.662 "num_base_bdevs_operational": 1, 00:19:54.662 "base_bdevs_list": [ 00:19:54.662 { 00:19:54.662 "name": null, 00:19:54.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.662 "is_configured": false, 00:19:54.662 "data_offset": 2048, 00:19:54.662 "data_size": 63488 00:19:54.662 }, 00:19:54.662 { 00:19:54.662 "name": "BaseBdev2", 00:19:54.662 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:54.662 "is_configured": true, 00:19:54.662 "data_offset": 2048, 00:19:54.662 "data_size": 63488 00:19:54.662 } 00:19:54.662 ] 00:19:54.662 }' 00:19:54.662 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:54.662 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:54.662 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:54.923 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:54.923 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:54.923 [2024-06-10 13:48:09.371334] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:54.923 [2024-06-10 13:48:09.374816] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b7200 00:19:54.923 [2024-06-10 13:48:09.376035] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:54.923 13:48:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.308 "name": "raid_bdev1", 00:19:56.308 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:56.308 "strip_size_kb": 0, 00:19:56.308 "state": "online", 00:19:56.308 "raid_level": "raid1", 00:19:56.308 "superblock": true, 00:19:56.308 "num_base_bdevs": 2, 00:19:56.308 "num_base_bdevs_discovered": 2, 00:19:56.308 "num_base_bdevs_operational": 2, 00:19:56.308 "process": { 00:19:56.308 "type": "rebuild", 00:19:56.308 "target": "spare", 00:19:56.308 "progress": { 00:19:56.308 "blocks": 22528, 00:19:56.308 "percent": 35 00:19:56.308 } 00:19:56.308 }, 00:19:56.308 
"base_bdevs_list": [ 00:19:56.308 { 00:19:56.308 "name": "spare", 00:19:56.308 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:56.308 "is_configured": true, 00:19:56.308 "data_offset": 2048, 00:19:56.308 "data_size": 63488 00:19:56.308 }, 00:19:56.308 { 00:19:56.308 "name": "BaseBdev2", 00:19:56.308 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:56.308 "is_configured": true, 00:19:56.308 "data_offset": 2048, 00:19:56.308 "data_size": 63488 00:19:56.308 } 00:19:56.308 ] 00:19:56.308 }' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:56.308 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=678 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.308 13:48:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.308 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.570 "name": "raid_bdev1", 00:19:56.570 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:56.570 "strip_size_kb": 0, 00:19:56.570 "state": "online", 00:19:56.570 "raid_level": "raid1", 00:19:56.570 "superblock": true, 00:19:56.570 "num_base_bdevs": 2, 00:19:56.570 "num_base_bdevs_discovered": 2, 00:19:56.570 "num_base_bdevs_operational": 2, 00:19:56.570 "process": { 00:19:56.570 "type": "rebuild", 00:19:56.570 "target": "spare", 00:19:56.570 "progress": { 00:19:56.570 "blocks": 28672, 00:19:56.570 "percent": 45 00:19:56.570 } 00:19:56.570 }, 00:19:56.570 "base_bdevs_list": [ 00:19:56.570 { 00:19:56.570 "name": "spare", 00:19:56.570 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:56.570 "is_configured": true, 00:19:56.570 "data_offset": 2048, 00:19:56.570 "data_size": 63488 00:19:56.570 }, 00:19:56.570 { 00:19:56.570 "name": "BaseBdev2", 00:19:56.570 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:56.570 "is_configured": true, 00:19:56.570 "data_offset": 2048, 00:19:56.570 "data_size": 63488 00:19:56.570 } 00:19:56.570 ] 00:19:56.570 }' 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.570 13:48:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.509 13:48:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.770 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:57.770 "name": "raid_bdev1", 00:19:57.770 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:57.770 "strip_size_kb": 0, 00:19:57.770 "state": "online", 00:19:57.770 "raid_level": "raid1", 00:19:57.770 "superblock": true, 00:19:57.770 "num_base_bdevs": 2, 00:19:57.770 "num_base_bdevs_discovered": 2, 00:19:57.770 "num_base_bdevs_operational": 2, 00:19:57.770 "process": { 00:19:57.770 "type": "rebuild", 00:19:57.770 "target": "spare", 
00:19:57.770 "progress": { 00:19:57.770 "blocks": 55296, 00:19:57.770 "percent": 87 00:19:57.770 } 00:19:57.770 }, 00:19:57.770 "base_bdevs_list": [ 00:19:57.770 { 00:19:57.770 "name": "spare", 00:19:57.770 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:57.770 "is_configured": true, 00:19:57.770 "data_offset": 2048, 00:19:57.770 "data_size": 63488 00:19:57.770 }, 00:19:57.770 { 00:19:57.770 "name": "BaseBdev2", 00:19:57.770 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:57.770 "is_configured": true, 00:19:57.770 "data_offset": 2048, 00:19:57.770 "data_size": 63488 00:19:57.770 } 00:19:57.770 ] 00:19:57.770 }' 00:19:57.770 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:57.770 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:57.770 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:58.031 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:58.031 13:48:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:58.031 [2024-06-10 13:48:12.494447] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:58.031 [2024-06-10 13:48:12.494495] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:58.031 [2024-06-10 13:48:12.494562] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.971 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.231 "name": "raid_bdev1", 00:19:59.231 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:59.231 "strip_size_kb": 0, 00:19:59.231 "state": "online", 00:19:59.231 "raid_level": "raid1", 00:19:59.231 "superblock": true, 00:19:59.231 "num_base_bdevs": 2, 00:19:59.231 "num_base_bdevs_discovered": 2, 00:19:59.231 "num_base_bdevs_operational": 2, 00:19:59.231 "base_bdevs_list": [ 00:19:59.231 { 00:19:59.231 "name": "spare", 00:19:59.231 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:59.231 "is_configured": true, 00:19:59.231 "data_offset": 2048, 00:19:59.231 "data_size": 63488 00:19:59.231 }, 00:19:59.231 { 00:19:59.231 "name": "BaseBdev2", 00:19:59.231 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:59.231 "is_configured": true, 00:19:59.231 "data_offset": 2048, 00:19:59.231 "data_size": 63488 00:19:59.231 } 00:19:59.231 ] 00:19:59.231 }' 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:59.231 
13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.231 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.492 "name": "raid_bdev1", 00:19:59.492 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:59.492 "strip_size_kb": 0, 00:19:59.492 "state": "online", 00:19:59.492 "raid_level": "raid1", 00:19:59.492 "superblock": true, 00:19:59.492 "num_base_bdevs": 2, 00:19:59.492 "num_base_bdevs_discovered": 2, 00:19:59.492 "num_base_bdevs_operational": 2, 00:19:59.492 "base_bdevs_list": [ 00:19:59.492 { 00:19:59.492 "name": "spare", 00:19:59.492 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:59.492 "is_configured": true, 00:19:59.492 "data_offset": 2048, 00:19:59.492 "data_size": 63488 00:19:59.492 }, 00:19:59.492 { 00:19:59.492 "name": "BaseBdev2", 00:19:59.492 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:59.492 "is_configured": true, 00:19:59.492 "data_offset": 2048, 00:19:59.492 "data_size": 63488 00:19:59.492 } 00:19:59.492 ] 00:19:59.492 }' 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.492 13:48:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.752 13:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.752 "name": "raid_bdev1", 00:19:59.752 "uuid": 
"ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:19:59.752 "strip_size_kb": 0, 00:19:59.752 "state": "online", 00:19:59.752 "raid_level": "raid1", 00:19:59.752 "superblock": true, 00:19:59.752 "num_base_bdevs": 2, 00:19:59.752 "num_base_bdevs_discovered": 2, 00:19:59.752 "num_base_bdevs_operational": 2, 00:19:59.752 "base_bdevs_list": [ 00:19:59.752 { 00:19:59.752 "name": "spare", 00:19:59.752 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:19:59.752 "is_configured": true, 00:19:59.752 "data_offset": 2048, 00:19:59.752 "data_size": 63488 00:19:59.752 }, 00:19:59.752 { 00:19:59.752 "name": "BaseBdev2", 00:19:59.752 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:19:59.752 "is_configured": true, 00:19:59.752 "data_offset": 2048, 00:19:59.752 "data_size": 63488 00:19:59.752 } 00:19:59.752 ] 00:19:59.752 }' 00:19:59.752 13:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.752 13:48:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.321 13:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:00.581 [2024-06-10 13:48:14.812388] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:00.581 [2024-06-10 13:48:14.812407] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:00.581 [2024-06-10 13:48:14.812459] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:00.581 [2024-06-10 13:48:14.812507] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:00.581 [2024-06-10 13:48:14.812514] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28b5d20 name raid_bdev1, state offline 00:20:00.581 13:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.581 13:48:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:00.581 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:00.841 /dev/nbd0 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:00.841 1+0 records in 00:20:00.841 1+0 records out 00:20:00.841 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024343 s, 16.8 MB/s 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:00.841 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:00.841 13:48:15 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:01.101 /dev/nbd1 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:01.101 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:01.101 1+0 records in 00:20:01.101 1+0 records out 00:20:01.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284707 s, 14.4 MB/s 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:01.102 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:01.362 13:48:15 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:01.362 13:48:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:01.621 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:01.881 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:02.140 [2024-06-10 13:48:16.413462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:20:02.140 [2024-06-10 13:48:16.413497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.140 [2024-06-10 13:48:16.413510] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b02a0 00:20:02.140 [2024-06-10 13:48:16.413517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.140 [2024-06-10 13:48:16.414922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.140 [2024-06-10 13:48:16.414945] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:02.140 [2024-06-10 13:48:16.415007] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:02.140 [2024-06-10 13:48:16.415027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:02.140 [2024-06-10 13:48:16.415110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:02.140 spare 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.140 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.140 [2024-06-10 13:48:16.515411] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x270aa80 00:20:02.140 [2024-06-10 13:48:16.515421] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:02.140 [2024-06-10 13:48:16.515579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b3d70 00:20:02.140 [2024-06-10 13:48:16.515697] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x270aa80 00:20:02.140 [2024-06-10 13:48:16.515703] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x270aa80 00:20:02.140 [2024-06-10 13:48:16.515789] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.400 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.400 "name": "raid_bdev1", 00:20:02.400 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:02.401 "strip_size_kb": 0, 00:20:02.401 "state": "online", 00:20:02.401 "raid_level": "raid1", 00:20:02.401 "superblock": true, 00:20:02.401 "num_base_bdevs": 2, 00:20:02.401 "num_base_bdevs_discovered": 2, 00:20:02.401 "num_base_bdevs_operational": 2, 00:20:02.401 "base_bdevs_list": [ 00:20:02.401 { 00:20:02.401 "name": "spare", 00:20:02.401 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:20:02.401 "is_configured": true, 00:20:02.401 "data_offset": 2048, 00:20:02.401 "data_size": 63488 00:20:02.401 }, 00:20:02.401 { 00:20:02.401 "name": "BaseBdev2", 00:20:02.401 "uuid": 
"7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:02.401 "is_configured": true, 00:20:02.401 "data_offset": 2048, 00:20:02.401 "data_size": 63488 00:20:02.401 } 00:20:02.401 ] 00:20:02.401 }' 00:20:02.401 13:48:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.401 13:48:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:02.971 "name": "raid_bdev1", 00:20:02.971 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:02.971 "strip_size_kb": 0, 00:20:02.971 "state": "online", 00:20:02.971 "raid_level": "raid1", 00:20:02.971 "superblock": true, 00:20:02.971 "num_base_bdevs": 2, 00:20:02.971 "num_base_bdevs_discovered": 2, 00:20:02.971 "num_base_bdevs_operational": 2, 00:20:02.971 "base_bdevs_list": [ 00:20:02.971 { 00:20:02.971 "name": "spare", 00:20:02.971 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:20:02.971 "is_configured": true, 00:20:02.971 "data_offset": 2048, 00:20:02.971 "data_size": 63488 00:20:02.971 }, 00:20:02.971 { 
00:20:02.971 "name": "BaseBdev2", 00:20:02.971 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:02.971 "is_configured": true, 00:20:02.971 "data_offset": 2048, 00:20:02.971 "data_size": 63488 00:20:02.971 } 00:20:02.971 ] 00:20:02.971 }' 00:20:02.971 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:03.231 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:03.490 [2024-06-10 13:48:17.893331] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.490 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.491 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.491 13:48:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.750 13:48:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.750 "name": "raid_bdev1", 00:20:03.750 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:03.750 "strip_size_kb": 0, 00:20:03.750 "state": "online", 00:20:03.750 "raid_level": "raid1", 00:20:03.750 "superblock": true, 00:20:03.750 "num_base_bdevs": 2, 00:20:03.750 "num_base_bdevs_discovered": 1, 00:20:03.750 "num_base_bdevs_operational": 1, 00:20:03.750 "base_bdevs_list": [ 00:20:03.750 { 00:20:03.750 "name": null, 00:20:03.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.750 "is_configured": false, 00:20:03.751 "data_offset": 2048, 00:20:03.751 "data_size": 63488 00:20:03.751 }, 00:20:03.751 { 00:20:03.751 "name": "BaseBdev2", 00:20:03.751 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:03.751 "is_configured": true, 00:20:03.751 "data_offset": 2048, 00:20:03.751 "data_size": 63488 00:20:03.751 } 00:20:03.751 ] 00:20:03.751 }' 00:20:03.751 13:48:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.751 13:48:18 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:20:04.320 13:48:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:04.580 [2024-06-10 13:48:18.871833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:04.581 [2024-06-10 13:48:18.871960] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:04.581 [2024-06-10 13:48:18.871970] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:04.581 [2024-06-10 13:48:18.871990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:04.581 [2024-06-10 13:48:18.875354] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x270a0c0 00:20:04.581 [2024-06-10 13:48:18.877051] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:04.581 13:48:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.520 13:48:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:05.780 "name": "raid_bdev1", 00:20:05.780 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:05.780 "strip_size_kb": 0, 00:20:05.780 "state": "online", 00:20:05.780 "raid_level": "raid1", 00:20:05.780 "superblock": true, 00:20:05.780 "num_base_bdevs": 2, 00:20:05.780 "num_base_bdevs_discovered": 2, 00:20:05.780 "num_base_bdevs_operational": 2, 00:20:05.780 "process": { 00:20:05.780 "type": "rebuild", 00:20:05.780 "target": "spare", 00:20:05.780 "progress": { 00:20:05.780 "blocks": 24576, 00:20:05.780 "percent": 38 00:20:05.780 } 00:20:05.780 }, 00:20:05.780 "base_bdevs_list": [ 00:20:05.780 { 00:20:05.780 "name": "spare", 00:20:05.780 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:20:05.780 "is_configured": true, 00:20:05.780 "data_offset": 2048, 00:20:05.780 "data_size": 63488 00:20:05.780 }, 00:20:05.780 { 00:20:05.780 "name": "BaseBdev2", 00:20:05.780 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:05.780 "is_configured": true, 00:20:05.780 "data_offset": 2048, 00:20:05.780 "data_size": 63488 00:20:05.780 } 00:20:05.780 ] 00:20:05.780 }' 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:05.780 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:06.040 [2024-06-10 13:48:20.381683] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:06.040 [2024-06-10 13:48:20.385947] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:06.040 [2024-06-10 13:48:20.385981] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.040 [2024-06-10 13:48:20.385991] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:06.040 [2024-06-10 13:48:20.385996] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.040 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:20:06.300 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.300 "name": "raid_bdev1", 00:20:06.300 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:06.300 "strip_size_kb": 0, 00:20:06.300 "state": "online", 00:20:06.300 "raid_level": "raid1", 00:20:06.300 "superblock": true, 00:20:06.300 "num_base_bdevs": 2, 00:20:06.300 "num_base_bdevs_discovered": 1, 00:20:06.300 "num_base_bdevs_operational": 1, 00:20:06.300 "base_bdevs_list": [ 00:20:06.300 { 00:20:06.300 "name": null, 00:20:06.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.300 "is_configured": false, 00:20:06.300 "data_offset": 2048, 00:20:06.300 "data_size": 63488 00:20:06.300 }, 00:20:06.300 { 00:20:06.300 "name": "BaseBdev2", 00:20:06.300 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:06.300 "is_configured": true, 00:20:06.300 "data_offset": 2048, 00:20:06.300 "data_size": 63488 00:20:06.300 } 00:20:06.300 ] 00:20:06.300 }' 00:20:06.300 13:48:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.300 13:48:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.869 13:48:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:07.129 [2024-06-10 13:48:21.368399] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:07.129 [2024-06-10 13:48:21.368436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.129 [2024-06-10 13:48:21.368457] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b04e0 00:20:07.129 [2024-06-10 13:48:21.368464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.129 [2024-06-10 13:48:21.368798] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:20:07.129 [2024-06-10 13:48:21.368811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:07.129 [2024-06-10 13:48:21.368874] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:07.129 [2024-06-10 13:48:21.368881] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:07.129 [2024-06-10 13:48:21.368887] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:07.129 [2024-06-10 13:48:21.368899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:07.129 [2024-06-10 13:48:21.372191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b6000 00:20:07.129 [2024-06-10 13:48:21.373407] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:07.129 spare 00:20:07.129 13:48:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.068 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.328 13:48:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:08.328 "name": "raid_bdev1", 00:20:08.328 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:08.328 "strip_size_kb": 0, 00:20:08.328 "state": "online", 00:20:08.328 "raid_level": "raid1", 00:20:08.328 "superblock": true, 00:20:08.328 "num_base_bdevs": 2, 00:20:08.328 "num_base_bdevs_discovered": 2, 00:20:08.328 "num_base_bdevs_operational": 2, 00:20:08.328 "process": { 00:20:08.328 "type": "rebuild", 00:20:08.328 "target": "spare", 00:20:08.328 "progress": { 00:20:08.328 "blocks": 24576, 00:20:08.328 "percent": 38 00:20:08.328 } 00:20:08.328 }, 00:20:08.328 "base_bdevs_list": [ 00:20:08.328 { 00:20:08.328 "name": "spare", 00:20:08.328 "uuid": "7a24b642-8956-5209-a20b-3f81a0454197", 00:20:08.328 "is_configured": true, 00:20:08.328 "data_offset": 2048, 00:20:08.328 "data_size": 63488 00:20:08.328 }, 00:20:08.328 { 00:20:08.328 "name": "BaseBdev2", 00:20:08.328 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:08.328 "is_configured": true, 00:20:08.328 "data_offset": 2048, 00:20:08.328 "data_size": 63488 00:20:08.328 } 00:20:08.328 ] 00:20:08.328 }' 00:20:08.328 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:08.328 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:08.328 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:08.328 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:08.328 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:08.588 [2024-06-10 13:48:22.878010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:08.588 [2024-06-10 13:48:22.882278] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:08.588 [2024-06-10 13:48:22.882309] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:08.588 [2024-06-10 13:48:22.882319] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:08.588 [2024-06-10 13:48:22.882323] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.588 13:48:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.847 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:20:08.847 "name": "raid_bdev1", 00:20:08.847 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:08.847 "strip_size_kb": 0, 00:20:08.847 "state": "online", 00:20:08.847 "raid_level": "raid1", 00:20:08.847 "superblock": true, 00:20:08.847 "num_base_bdevs": 2, 00:20:08.847 "num_base_bdevs_discovered": 1, 00:20:08.847 "num_base_bdevs_operational": 1, 00:20:08.847 "base_bdevs_list": [ 00:20:08.847 { 00:20:08.847 "name": null, 00:20:08.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.847 "is_configured": false, 00:20:08.847 "data_offset": 2048, 00:20:08.847 "data_size": 63488 00:20:08.847 }, 00:20:08.847 { 00:20:08.847 "name": "BaseBdev2", 00:20:08.847 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:08.847 "is_configured": true, 00:20:08.847 "data_offset": 2048, 00:20:08.847 "data_size": 63488 00:20:08.847 } 00:20:08.847 ] 00:20:08.847 }' 00:20:08.848 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.848 13:48:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.419 13:48:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:09.419 "name": "raid_bdev1", 00:20:09.419 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:09.419 "strip_size_kb": 0, 00:20:09.419 "state": "online", 00:20:09.419 "raid_level": "raid1", 00:20:09.419 "superblock": true, 00:20:09.419 "num_base_bdevs": 2, 00:20:09.419 "num_base_bdevs_discovered": 1, 00:20:09.419 "num_base_bdevs_operational": 1, 00:20:09.419 "base_bdevs_list": [ 00:20:09.419 { 00:20:09.419 "name": null, 00:20:09.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:09.419 "is_configured": false, 00:20:09.419 "data_offset": 2048, 00:20:09.419 "data_size": 63488 00:20:09.419 }, 00:20:09.419 { 00:20:09.419 "name": "BaseBdev2", 00:20:09.419 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:09.419 "is_configured": true, 00:20:09.419 "data_offset": 2048, 00:20:09.419 "data_size": 63488 00:20:09.419 } 00:20:09.419 ] 00:20:09.419 }' 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:09.419 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:09.679 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:09.679 13:48:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:09.679 13:48:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:09.939 [2024-06-10 13:48:24.301813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:09.939 [2024-06-10 13:48:24.301845] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.939 [2024-06-10 13:48:24.301862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b2990 00:20:09.939 [2024-06-10 13:48:24.301868] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.939 [2024-06-10 13:48:24.302181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.939 [2024-06-10 13:48:24.302194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:09.939 [2024-06-10 13:48:24.302243] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:09.939 [2024-06-10 13:48:24.302250] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:09.939 [2024-06-10 13:48:24.302255] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:09.939 BaseBdev1 00:20:09.939 13:48:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.879 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.139 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.139 "name": "raid_bdev1", 00:20:11.139 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:11.139 "strip_size_kb": 0, 00:20:11.139 "state": "online", 00:20:11.139 "raid_level": "raid1", 00:20:11.139 "superblock": true, 00:20:11.139 "num_base_bdevs": 2, 00:20:11.139 "num_base_bdevs_discovered": 1, 00:20:11.139 "num_base_bdevs_operational": 1, 00:20:11.139 "base_bdevs_list": [ 00:20:11.139 { 00:20:11.139 "name": null, 00:20:11.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.139 "is_configured": false, 00:20:11.139 "data_offset": 2048, 00:20:11.139 "data_size": 63488 00:20:11.139 }, 00:20:11.139 { 00:20:11.139 "name": "BaseBdev2", 00:20:11.139 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:11.139 "is_configured": true, 00:20:11.139 "data_offset": 2048, 00:20:11.139 "data_size": 63488 00:20:11.139 } 00:20:11.139 ] 00:20:11.139 }' 00:20:11.139 13:48:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.139 13:48:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.710 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:11.971 "name": "raid_bdev1", 00:20:11.971 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:11.971 "strip_size_kb": 0, 00:20:11.971 "state": "online", 00:20:11.971 "raid_level": "raid1", 00:20:11.971 "superblock": true, 00:20:11.971 "num_base_bdevs": 2, 00:20:11.971 "num_base_bdevs_discovered": 1, 00:20:11.971 "num_base_bdevs_operational": 1, 00:20:11.971 "base_bdevs_list": [ 00:20:11.971 { 00:20:11.971 "name": null, 00:20:11.971 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.971 "is_configured": false, 00:20:11.971 "data_offset": 2048, 00:20:11.971 "data_size": 63488 00:20:11.971 }, 00:20:11.971 { 00:20:11.971 "name": "BaseBdev2", 00:20:11.971 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:11.971 "is_configured": true, 00:20:11.971 "data_offset": 2048, 00:20:11.971 "data_size": 63488 00:20:11.971 } 00:20:11.971 ] 00:20:11.971 }' 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:11.971 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 
BaseBdev1 00:20:12.231 [2024-06-10 13:48:26.579627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:12.231 [2024-06-10 13:48:26.579733] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:12.231 [2024-06-10 13:48:26.579742] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:12.231 request: 00:20:12.231 { 00:20:12.231 "raid_bdev": "raid_bdev1", 00:20:12.231 "base_bdev": "BaseBdev1", 00:20:12.231 "method": "bdev_raid_add_base_bdev", 00:20:12.231 "req_id": 1 00:20:12.231 } 00:20:12.231 Got JSON-RPC error response 00:20:12.231 response: 00:20:12.231 { 00:20:12.231 "code": -22, 00:20:12.231 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:12.231 } 00:20:12.231 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:20:12.231 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:20:12.231 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:20:12.231 13:48:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:20:12.231 13:48:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.276 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.537 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.537 "name": "raid_bdev1", 00:20:13.537 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:13.537 "strip_size_kb": 0, 00:20:13.537 "state": "online", 00:20:13.537 "raid_level": "raid1", 00:20:13.537 "superblock": true, 00:20:13.537 "num_base_bdevs": 2, 00:20:13.537 "num_base_bdevs_discovered": 1, 00:20:13.537 "num_base_bdevs_operational": 1, 00:20:13.537 "base_bdevs_list": [ 00:20:13.537 { 00:20:13.537 "name": null, 00:20:13.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.537 "is_configured": false, 00:20:13.537 "data_offset": 2048, 00:20:13.537 "data_size": 63488 00:20:13.537 }, 00:20:13.537 { 00:20:13.537 "name": "BaseBdev2", 00:20:13.537 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:13.537 "is_configured": true, 00:20:13.537 "data_offset": 2048, 00:20:13.537 "data_size": 63488 00:20:13.537 } 00:20:13.537 ] 00:20:13.537 }' 00:20:13.537 13:48:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.537 13:48:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.110 13:48:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:14.110 "name": "raid_bdev1", 00:20:14.110 "uuid": "ae91a31e-24ee-48f3-80f8-88e7cb962746", 00:20:14.110 "strip_size_kb": 0, 00:20:14.110 "state": "online", 00:20:14.110 "raid_level": "raid1", 00:20:14.110 "superblock": true, 00:20:14.110 "num_base_bdevs": 2, 00:20:14.110 "num_base_bdevs_discovered": 1, 00:20:14.110 "num_base_bdevs_operational": 1, 00:20:14.110 "base_bdevs_list": [ 00:20:14.110 { 00:20:14.110 "name": null, 00:20:14.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.110 "is_configured": false, 00:20:14.110 "data_offset": 2048, 00:20:14.110 "data_size": 63488 00:20:14.110 }, 00:20:14.110 { 00:20:14.110 "name": "BaseBdev2", 00:20:14.110 "uuid": "7e1009ad-e78d-57eb-9cd8-cc6c4e1ce439", 00:20:14.110 "is_configured": true, 00:20:14.110 "data_offset": 2048, 00:20:14.110 "data_size": 63488 00:20:14.110 } 00:20:14.110 ] 00:20:14.110 }' 00:20:14.110 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1623993 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1623993 ']' 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 1623993 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1623993 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1623993' 00:20:14.371 killing process with pid 1623993 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 1623993 00:20:14.371 Received shutdown signal, test time was about 60.000000 seconds 00:20:14.371 00:20:14.371 Latency(us) 00:20:14.371 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:14.371 =================================================================================================================== 00:20:14.371 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:14.371 [2024-06-10 13:48:28.722994] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:14.371 [2024-06-10 13:48:28.723070] 
bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:14.371 [2024-06-10 13:48:28.723106] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:14.371 [2024-06-10 13:48:28.723113] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x270aa80 name raid_bdev1, state offline 00:20:14.371 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 1623993 00:20:14.371 [2024-06-10 13:48:28.738740] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:14.632 00:20:14.632 real 0m31.402s 00:20:14.632 user 0m46.976s 00:20:14.632 sys 0m4.231s 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.632 ************************************ 00:20:14.632 END TEST raid_rebuild_test_sb 00:20:14.632 ************************************ 00:20:14.632 13:48:28 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:20:14.632 13:48:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:14.632 13:48:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:14.632 13:48:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:14.632 ************************************ 00:20:14.632 START TEST raid_rebuild_test_io 00:20:14.632 ************************************ 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false true true 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:14.632 
13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:14.632 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:14.633 13:48:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1630958 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1630958 /var/tmp/spdk-raid.sock 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 1630958 ']' 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:14.633 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:14.633 13:48:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:14.633 [2024-06-10 13:48:29.011751] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:20:14.633 [2024-06-10 13:48:29.011797] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630958 ] 00:20:14.633 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:20:14.633 Zero copy mechanism will not be used. 00:20:14.633 [2024-06-10 13:48:29.097358] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:14.894 [2024-06-10 13:48:29.163713] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:20:14.894 [2024-06-10 13:48:29.210203] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.894 [2024-06-10 13:48:29.210225] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:15.465 13:48:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:15.465 13:48:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:20:15.465 13:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:15.465 13:48:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:15.726 BaseBdev1_malloc 00:20:15.726 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:15.987 [2024-06-10 13:48:30.229234] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:15.987 [2024-06-10 13:48:30.229275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.987 [2024-06-10 13:48:30.229292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24ef900 00:20:15.987 [2024-06-10 13:48:30.229299] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.987 [2024-06-10 13:48:30.230730] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.987 [2024-06-10 13:48:30.230751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:20:15.987 BaseBdev1 00:20:15.987 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:15.987 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:15.987 BaseBdev2_malloc 00:20:15.987 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:16.247 [2024-06-10 13:48:30.624277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:16.247 [2024-06-10 13:48:30.624306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.247 [2024-06-10 13:48:30.624317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f09c0 00:20:16.247 [2024-06-10 13:48:30.624324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.247 [2024-06-10 13:48:30.625580] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.247 [2024-06-10 13:48:30.625599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:16.247 BaseBdev2 00:20:16.247 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:16.507 spare_malloc 00:20:16.507 13:48:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:16.767 spare_delay 00:20:16.767 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:16.767 [2024-06-10 13:48:31.227701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:16.767 [2024-06-10 13:48:31.227729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.767 [2024-06-10 13:48:31.227741] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x269e6b0 00:20:16.767 [2024-06-10 13:48:31.227748] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.767 [2024-06-10 13:48:31.229001] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.767 [2024-06-10 13:48:31.229020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:16.767 spare 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:17.028 [2024-06-10 13:48:31.420204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:17.028 [2024-06-10 13:48:31.421240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:17.028 [2024-06-10 13:48:31.421297] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x269fd20 00:20:17.028 [2024-06-10 13:48:31.421303] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:17.028 [2024-06-10 13:48:31.421465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24ef5d0 00:20:17.028 [2024-06-10 13:48:31.421575] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x269fd20 00:20:17.028 [2024-06-10 13:48:31.421582] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x269fd20 00:20:17.028 [2024-06-10 13:48:31.421665] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.028 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.288 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.288 "name": "raid_bdev1", 00:20:17.288 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:17.288 "strip_size_kb": 0, 00:20:17.288 "state": "online", 00:20:17.288 "raid_level": "raid1", 00:20:17.288 "superblock": false, 00:20:17.288 "num_base_bdevs": 2, 00:20:17.288 "num_base_bdevs_discovered": 2, 00:20:17.288 
"num_base_bdevs_operational": 2, 00:20:17.288 "base_bdevs_list": [ 00:20:17.288 { 00:20:17.288 "name": "BaseBdev1", 00:20:17.288 "uuid": "51508917-508b-5491-9b9f-7fc0edd05ea4", 00:20:17.288 "is_configured": true, 00:20:17.288 "data_offset": 0, 00:20:17.288 "data_size": 65536 00:20:17.288 }, 00:20:17.288 { 00:20:17.288 "name": "BaseBdev2", 00:20:17.288 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:17.288 "is_configured": true, 00:20:17.288 "data_offset": 0, 00:20:17.288 "data_size": 65536 00:20:17.288 } 00:20:17.288 ] 00:20:17.288 }' 00:20:17.288 13:48:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.288 13:48:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:17.858 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:17.858 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:18.118 [2024-06-10 13:48:32.374796] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:18.118 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:18.118 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.118 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev1 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:18.378 [2024-06-10 13:48:32.696903] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26998e0 00:20:18.378 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:18.378 Zero copy mechanism will not be used. 00:20:18.378 Running I/O for 60 seconds... 00:20:18.378 [2024-06-10 13:48:32.786477] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:18.378 [2024-06-10 13:48:32.793419] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26998e0 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:18.378 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.379 13:48:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.640 13:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.640 "name": "raid_bdev1", 00:20:18.640 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:18.640 "strip_size_kb": 0, 00:20:18.640 "state": "online", 00:20:18.640 "raid_level": "raid1", 00:20:18.640 "superblock": false, 00:20:18.640 "num_base_bdevs": 2, 00:20:18.640 "num_base_bdevs_discovered": 1, 00:20:18.640 "num_base_bdevs_operational": 1, 00:20:18.640 "base_bdevs_list": [ 00:20:18.640 { 00:20:18.640 "name": null, 00:20:18.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.640 "is_configured": false, 00:20:18.640 "data_offset": 0, 00:20:18.640 "data_size": 65536 00:20:18.640 }, 00:20:18.640 { 00:20:18.640 "name": "BaseBdev2", 00:20:18.640 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:18.640 "is_configured": true, 00:20:18.640 "data_offset": 0, 00:20:18.640 "data_size": 65536 00:20:18.640 } 00:20:18.640 ] 00:20:18.640 }' 00:20:18.640 13:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.640 13:48:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:19.210 13:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:19.471 [2024-06-10 13:48:33.729439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:19.471 13:48:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:19.471 [2024-06-10 13:48:33.771069] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26216e0 00:20:19.471 [2024-06-10 13:48:33.772789] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:20:19.471 [2024-06-10 13:48:33.895119] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:19.471 [2024-06-10 13:48:33.895330] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:19.731 [2024-06-10 13:48:34.111453] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:19.731 [2024-06-10 13:48:34.111579] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:20.300 [2024-06-10 13:48:34.480611] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:20.300 [2024-06-10 13:48:34.596989] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:20.300 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.560 [2024-06-10 13:48:34.821079] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 
offset_begin: 12288 offset_end: 18432 00:20:20.560 [2024-06-10 13:48:34.821364] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:20.560 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:20.560 "name": "raid_bdev1", 00:20:20.560 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:20.560 "strip_size_kb": 0, 00:20:20.560 "state": "online", 00:20:20.560 "raid_level": "raid1", 00:20:20.560 "superblock": false, 00:20:20.560 "num_base_bdevs": 2, 00:20:20.560 "num_base_bdevs_discovered": 2, 00:20:20.560 "num_base_bdevs_operational": 2, 00:20:20.560 "process": { 00:20:20.561 "type": "rebuild", 00:20:20.561 "target": "spare", 00:20:20.561 "progress": { 00:20:20.561 "blocks": 14336, 00:20:20.561 "percent": 21 00:20:20.561 } 00:20:20.561 }, 00:20:20.561 "base_bdevs_list": [ 00:20:20.561 { 00:20:20.561 "name": "spare", 00:20:20.561 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:20.561 "is_configured": true, 00:20:20.561 "data_offset": 0, 00:20:20.561 "data_size": 65536 00:20:20.561 }, 00:20:20.561 { 00:20:20.561 "name": "BaseBdev2", 00:20:20.561 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:20.561 "is_configured": true, 00:20:20.561 "data_offset": 0, 00:20:20.561 "data_size": 65536 00:20:20.561 } 00:20:20.561 ] 00:20:20.561 }' 00:20:20.561 13:48:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:20.561 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:20.561 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:20.561 [2024-06-10 13:48:35.029938] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:20.561 [2024-06-10 13:48:35.030070] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:20.820 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:20.820 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:20.820 [2024-06-10 13:48:35.224599] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:21.080 [2024-06-10 13:48:35.383602] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:21.080 [2024-06-10 13:48:35.391777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.080 [2024-06-10 13:48:35.391795] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:21.080 [2024-06-10 13:48:35.391801] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:21.080 [2024-06-10 13:48:35.409596] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x26998e0 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.080 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.340 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.340 "name": "raid_bdev1", 00:20:21.340 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:21.340 "strip_size_kb": 0, 00:20:21.340 "state": "online", 00:20:21.340 "raid_level": "raid1", 00:20:21.340 "superblock": false, 00:20:21.340 "num_base_bdevs": 2, 00:20:21.340 "num_base_bdevs_discovered": 1, 00:20:21.340 "num_base_bdevs_operational": 1, 00:20:21.340 "base_bdevs_list": [ 00:20:21.340 { 00:20:21.340 "name": null, 00:20:21.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.340 "is_configured": false, 00:20:21.340 "data_offset": 0, 00:20:21.340 "data_size": 65536 00:20:21.340 }, 00:20:21.340 { 00:20:21.340 "name": "BaseBdev2", 00:20:21.340 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:21.340 "is_configured": true, 00:20:21.340 "data_offset": 0, 00:20:21.340 "data_size": 65536 00:20:21.340 } 00:20:21.340 ] 00:20:21.340 }' 00:20:21.340 13:48:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.340 13:48:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:21.912 
13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.912 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:22.172 "name": "raid_bdev1", 00:20:22.172 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:22.172 "strip_size_kb": 0, 00:20:22.172 "state": "online", 00:20:22.172 "raid_level": "raid1", 00:20:22.172 "superblock": false, 00:20:22.172 "num_base_bdevs": 2, 00:20:22.172 "num_base_bdevs_discovered": 1, 00:20:22.172 "num_base_bdevs_operational": 1, 00:20:22.172 "base_bdevs_list": [ 00:20:22.172 { 00:20:22.172 "name": null, 00:20:22.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.172 "is_configured": false, 00:20:22.172 "data_offset": 0, 00:20:22.172 "data_size": 65536 00:20:22.172 }, 00:20:22.172 { 00:20:22.172 "name": "BaseBdev2", 00:20:22.172 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:22.172 "is_configured": true, 00:20:22.172 "data_offset": 0, 00:20:22.172 "data_size": 65536 00:20:22.172 } 00:20:22.172 ] 00:20:22.172 }' 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:22.172 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:22.432 [2024-06-10 13:48:36.708573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:22.432 13:48:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:22.432 [2024-06-10 13:48:36.763798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2621ee0 00:20:22.432 [2024-06-10 13:48:36.764999] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:22.432 [2024-06-10 13:48:36.873346] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:22.432 [2024-06-10 13:48:36.873595] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:22.692 [2024-06-10 13:48:37.090978] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:22.692 [2024-06-10 13:48:37.091093] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:23.264 [2024-06-10 13:48:37.429804] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:23.264 [2024-06-10 13:48:37.659621] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:23.525 "name": "raid_bdev1", 00:20:23.525 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:23.525 "strip_size_kb": 0, 00:20:23.525 "state": "online", 00:20:23.525 "raid_level": "raid1", 00:20:23.525 "superblock": false, 00:20:23.525 "num_base_bdevs": 2, 00:20:23.525 "num_base_bdevs_discovered": 2, 00:20:23.525 "num_base_bdevs_operational": 2, 00:20:23.525 "process": { 00:20:23.525 "type": "rebuild", 00:20:23.525 "target": "spare", 00:20:23.525 "progress": { 00:20:23.525 "blocks": 12288, 00:20:23.525 "percent": 18 00:20:23.525 } 00:20:23.525 }, 00:20:23.525 "base_bdevs_list": [ 00:20:23.525 { 00:20:23.525 "name": "spare", 00:20:23.525 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:23.525 "is_configured": true, 00:20:23.525 "data_offset": 0, 00:20:23.525 "data_size": 65536 00:20:23.525 }, 00:20:23.525 { 00:20:23.525 "name": "BaseBdev2", 00:20:23.525 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:23.525 "is_configured": true, 00:20:23.525 "data_offset": 0, 00:20:23.525 "data_size": 65536 00:20:23.525 } 00:20:23.525 ] 00:20:23.525 }' 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:23.525 [2024-06-10 13:48:37.983723] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 
offset_begin: 12288 offset_end: 18432 00:20:23.525 [2024-06-10 13:48:37.984002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:23.525 13:48:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=706 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:23.785 "name": "raid_bdev1", 00:20:23.785 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:23.785 "strip_size_kb": 0, 00:20:23.785 "state": "online", 00:20:23.785 "raid_level": "raid1", 00:20:23.785 "superblock": false, 00:20:23.785 "num_base_bdevs": 2, 00:20:23.785 "num_base_bdevs_discovered": 2, 00:20:23.785 "num_base_bdevs_operational": 2, 00:20:23.785 "process": { 00:20:23.785 "type": "rebuild", 00:20:23.785 "target": "spare", 00:20:23.785 "progress": { 00:20:23.785 "blocks": 16384, 00:20:23.785 "percent": 25 00:20:23.785 } 00:20:23.785 }, 00:20:23.785 "base_bdevs_list": [ 00:20:23.785 { 00:20:23.785 "name": "spare", 00:20:23.785 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:23.785 "is_configured": true, 00:20:23.785 "data_offset": 0, 00:20:23.785 "data_size": 65536 00:20:23.785 }, 00:20:23.785 { 00:20:23.785 "name": "BaseBdev2", 00:20:23.785 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:23.785 "is_configured": true, 00:20:23.785 "data_offset": 0, 00:20:23.785 "data_size": 65536 00:20:23.785 } 00:20:23.785 ] 00:20:23.785 }' 00:20:23.785 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:24.046 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.046 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:24.046 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:24.046 13:48:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:24.046 [2024-06-10 13:48:38.394846] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:24.306 [2024-06-10 13:48:38.610393] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:24.306 [2024-06-10 13:48:38.610504] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:24.566 [2024-06-10 13:48:38.942102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:20:24.566 [2024-06-10 13:48:38.942360] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:20:24.826 [2024-06-10 13:48:39.158295] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:25.089 "name": "raid_bdev1", 00:20:25.089 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:25.089 "strip_size_kb": 
0, 00:20:25.089 "state": "online", 00:20:25.089 "raid_level": "raid1", 00:20:25.089 "superblock": false, 00:20:25.089 "num_base_bdevs": 2, 00:20:25.089 "num_base_bdevs_discovered": 2, 00:20:25.089 "num_base_bdevs_operational": 2, 00:20:25.089 "process": { 00:20:25.089 "type": "rebuild", 00:20:25.089 "target": "spare", 00:20:25.089 "progress": { 00:20:25.089 "blocks": 30720, 00:20:25.089 "percent": 46 00:20:25.089 } 00:20:25.089 }, 00:20:25.089 "base_bdevs_list": [ 00:20:25.089 { 00:20:25.089 "name": "spare", 00:20:25.089 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:25.089 "is_configured": true, 00:20:25.089 "data_offset": 0, 00:20:25.089 "data_size": 65536 00:20:25.089 }, 00:20:25.089 { 00:20:25.089 "name": "BaseBdev2", 00:20:25.089 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:25.089 "is_configured": true, 00:20:25.089 "data_offset": 0, 00:20:25.089 "data_size": 65536 00:20:25.089 } 00:20:25.089 ] 00:20:25.089 }' 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:25.089 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:25.350 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:25.350 13:48:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:25.350 [2024-06-10 13:48:39.663807] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:25.611 [2024-06-10 13:48:39.987453] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:26.182 [2024-06-10 13:48:40.456992] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:26.182 
13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.182 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:26.442 "name": "raid_bdev1", 00:20:26.442 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:26.442 "strip_size_kb": 0, 00:20:26.442 "state": "online", 00:20:26.442 "raid_level": "raid1", 00:20:26.442 "superblock": false, 00:20:26.442 "num_base_bdevs": 2, 00:20:26.442 "num_base_bdevs_discovered": 2, 00:20:26.442 "num_base_bdevs_operational": 2, 00:20:26.442 "process": { 00:20:26.442 "type": "rebuild", 00:20:26.442 "target": "spare", 00:20:26.442 "progress": { 00:20:26.442 "blocks": 49152, 00:20:26.442 "percent": 75 00:20:26.442 } 00:20:26.442 }, 00:20:26.442 "base_bdevs_list": [ 00:20:26.442 { 00:20:26.442 "name": "spare", 00:20:26.442 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:26.442 "is_configured": true, 00:20:26.442 "data_offset": 0, 00:20:26.442 "data_size": 65536 00:20:26.442 }, 00:20:26.442 { 00:20:26.442 "name": "BaseBdev2", 00:20:26.442 "uuid": 
"a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:26.442 "is_configured": true, 00:20:26.442 "data_offset": 0, 00:20:26.442 "data_size": 65536 00:20:26.442 } 00:20:26.442 ] 00:20:26.442 }' 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:26.442 13:48:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:26.442 [2024-06-10 13:48:40.905357] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:27.013 [2024-06-10 13:48:41.239782] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:27.273 [2024-06-10 13:48:41.687229] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:27.533 [2024-06-10 13:48:41.787517] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:27.533 [2024-06-10 13:48:41.788688] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.533 13:48:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.794 "name": "raid_bdev1", 00:20:27.794 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:27.794 "strip_size_kb": 0, 00:20:27.794 "state": "online", 00:20:27.794 "raid_level": "raid1", 00:20:27.794 "superblock": false, 00:20:27.794 "num_base_bdevs": 2, 00:20:27.794 "num_base_bdevs_discovered": 2, 00:20:27.794 "num_base_bdevs_operational": 2, 00:20:27.794 "base_bdevs_list": [ 00:20:27.794 { 00:20:27.794 "name": "spare", 00:20:27.794 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:27.794 "is_configured": true, 00:20:27.794 "data_offset": 0, 00:20:27.794 "data_size": 65536 00:20:27.794 }, 00:20:27.794 { 00:20:27.794 "name": "BaseBdev2", 00:20:27.794 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:27.794 "is_configured": true, 00:20:27.794 "data_offset": 0, 00:20:27.794 "data_size": 65536 00:20:27.794 } 00:20:27.794 ] 00:20:27.794 }' 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:20:27.794 13:48:42 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.794 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:28.055 "name": "raid_bdev1", 00:20:28.055 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:28.055 "strip_size_kb": 0, 00:20:28.055 "state": "online", 00:20:28.055 "raid_level": "raid1", 00:20:28.055 "superblock": false, 00:20:28.055 "num_base_bdevs": 2, 00:20:28.055 "num_base_bdevs_discovered": 2, 00:20:28.055 "num_base_bdevs_operational": 2, 00:20:28.055 "base_bdevs_list": [ 00:20:28.055 { 00:20:28.055 "name": "spare", 00:20:28.055 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:28.055 "is_configured": true, 00:20:28.055 "data_offset": 0, 00:20:28.055 "data_size": 65536 00:20:28.055 }, 00:20:28.055 { 00:20:28.055 "name": "BaseBdev2", 00:20:28.055 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:28.055 "is_configured": true, 00:20:28.055 "data_offset": 0, 00:20:28.055 "data_size": 65536 00:20:28.055 } 00:20:28.055 ] 00:20:28.055 }' 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.055 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.316 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.316 "name": "raid_bdev1", 00:20:28.316 "uuid": "ba442100-0427-4f0a-9a31-98c07151bd71", 00:20:28.316 "strip_size_kb": 0, 00:20:28.316 "state": "online", 00:20:28.316 "raid_level": "raid1", 
00:20:28.316 "superblock": false, 00:20:28.316 "num_base_bdevs": 2, 00:20:28.316 "num_base_bdevs_discovered": 2, 00:20:28.316 "num_base_bdevs_operational": 2, 00:20:28.316 "base_bdevs_list": [ 00:20:28.316 { 00:20:28.316 "name": "spare", 00:20:28.316 "uuid": "17ee19db-66ff-5d4f-9fd4-282ca642435b", 00:20:28.316 "is_configured": true, 00:20:28.316 "data_offset": 0, 00:20:28.316 "data_size": 65536 00:20:28.316 }, 00:20:28.316 { 00:20:28.316 "name": "BaseBdev2", 00:20:28.316 "uuid": "a49eb059-e29f-537e-a2b6-766bb2929d67", 00:20:28.316 "is_configured": true, 00:20:28.316 "data_offset": 0, 00:20:28.316 "data_size": 65536 00:20:28.316 } 00:20:28.316 ] 00:20:28.316 }' 00:20:28.316 13:48:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.316 13:48:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:28.886 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:29.146 [2024-06-10 13:48:43.424683] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:29.146 [2024-06-10 13:48:43.424707] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:29.146 00:20:29.146 Latency(us) 00:20:29.146 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:29.146 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:29.146 raid_bdev1 : 10.80 106.47 319.42 0.00 0.00 12913.49 254.29 115343.36 00:20:29.146 =================================================================================================================== 00:20:29.146 Total : 106.47 319.42 0.00 0.00 12913.49 254.29 115343.36 00:20:29.146 [2024-06-10 13:48:43.528371] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.146 [2024-06-10 13:48:43.528396] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:29.146 [2024-06-10 13:48:43.528460] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:29.146 [2024-06-10 13:48:43.528467] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x269fd20 name raid_bdev1, state offline 00:20:29.146 0 00:20:29.146 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.146 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:29.405 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:29.406 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:29.406 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:29.665 /dev/nbd0 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:29.666 1+0 records in 00:20:29.666 1+0 records out 00:20:29.666 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239887 s, 17.1 MB/s 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:20:29.666 13:48:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:29.666 13:48:43 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:29.666 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:20:29.926 /dev/nbd1 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:29.926 
13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:29.926 1+0 records in 00:20:29.926 1+0 records out 00:20:29.926 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250836 s, 16.3 MB/s 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 
)) 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:29.926 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:30.186 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1630958 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 1630958 ']' 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 1630958 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:20:30.447 13:48:44 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1630958 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1630958' 00:20:30.447 killing process with pid 1630958 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 1630958 00:20:30.447 Received shutdown signal, test time was about 12.085766 seconds 00:20:30.447 00:20:30.447 Latency(us) 00:20:30.447 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:30.447 =================================================================================================================== 00:20:30.447 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:30.447 [2024-06-10 13:48:44.812755] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:30.447 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 1630958 00:20:30.447 [2024-06-10 13:48:44.824648] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:30.707 13:48:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:30.707 00:20:30.707 real 0m16.017s 00:20:30.707 user 0m24.467s 00:20:30.707 sys 0m1.889s 00:20:30.707 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:30.707 13:48:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:30.707 ************************************ 00:20:30.707 END TEST raid_rebuild_test_io 00:20:30.707 ************************************ 00:20:30.707 13:48:44 bdev_raid -- 
bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:20:30.707 13:48:44 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:30.707 13:48:44 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:30.707 13:48:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:30.707 ************************************ 00:20:30.707 START TEST raid_rebuild_test_sb_io 00:20:30.707 ************************************ 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true true true 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs 
)) 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:30.707 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1634491 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1634491 /var/tmp/spdk-raid.sock 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 1634491 ']' 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:30.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:30.708 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:30.708 [2024-06-10 13:48:45.094084] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:20:30.708 [2024-06-10 13:48:45.094129] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1634491 ] 00:20:30.708 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:30.708 Zero copy mechanism will not be used. 
00:20:30.708 [2024-06-10 13:48:45.181533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:30.967 [2024-06-10 13:48:45.245905] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:20:30.967 [2024-06-10 13:48:45.285503] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:30.967 [2024-06-10 13:48:45.285539] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:31.537 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:31.537 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:20:31.537 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:31.537 13:48:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:31.797 BaseBdev1_malloc 00:20:31.797 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:32.057 [2024-06-10 13:48:46.324670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:32.057 [2024-06-10 13:48:46.324707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.057 [2024-06-10 13:48:46.324721] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15d6900 00:20:32.057 [2024-06-10 13:48:46.324728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.057 [2024-06-10 13:48:46.326126] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.057 [2024-06-10 13:48:46.326146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:32.057 
BaseBdev1 00:20:32.057 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:32.057 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:32.057 BaseBdev2_malloc 00:20:32.057 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:32.317 [2024-06-10 13:48:46.703819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:32.317 [2024-06-10 13:48:46.703846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.317 [2024-06-10 13:48:46.703857] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15d79c0 00:20:32.317 [2024-06-10 13:48:46.703863] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.317 [2024-06-10 13:48:46.705108] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.317 [2024-06-10 13:48:46.705127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:32.317 BaseBdev2 00:20:32.317 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:32.576 spare_malloc 00:20:32.576 13:48:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:32.836 spare_delay 00:20:32.836 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:32.836 [2024-06-10 13:48:47.307439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:32.836 [2024-06-10 13:48:47.307469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.836 [2024-06-10 13:48:47.307480] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17856b0 00:20:32.836 [2024-06-10 13:48:47.307487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.836 [2024-06-10 13:48:47.308736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.836 [2024-06-10 13:48:47.308756] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:33.096 spare 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:20:33.096 [2024-06-10 13:48:47.507967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:33.096 [2024-06-10 13:48:47.509005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:33.096 [2024-06-10 13:48:47.509127] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1786d20 00:20:33.096 [2024-06-10 13:48:47.509136] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:33.096 [2024-06-10 13:48:47.509291] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15d65d0 00:20:33.096 [2024-06-10 13:48:47.509403] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1786d20 00:20:33.096 [2024-06-10 13:48:47.509409] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1786d20 00:20:33.096 [2024-06-10 13:48:47.509479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.096 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.355 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.355 "name": "raid_bdev1", 00:20:33.355 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:33.355 "strip_size_kb": 0, 00:20:33.355 "state": "online", 00:20:33.355 "raid_level": "raid1", 00:20:33.356 "superblock": true, 00:20:33.356 "num_base_bdevs": 2, 00:20:33.356 
"num_base_bdevs_discovered": 2, 00:20:33.356 "num_base_bdevs_operational": 2, 00:20:33.356 "base_bdevs_list": [ 00:20:33.356 { 00:20:33.356 "name": "BaseBdev1", 00:20:33.356 "uuid": "3df2701c-2b85-5b92-9dfb-a8dbd7296647", 00:20:33.356 "is_configured": true, 00:20:33.356 "data_offset": 2048, 00:20:33.356 "data_size": 63488 00:20:33.356 }, 00:20:33.356 { 00:20:33.356 "name": "BaseBdev2", 00:20:33.356 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:33.356 "is_configured": true, 00:20:33.356 "data_offset": 2048, 00:20:33.356 "data_size": 63488 00:20:33.356 } 00:20:33.356 ] 00:20:33.356 }' 00:20:33.356 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.356 13:48:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:33.925 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:33.925 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:34.184 [2024-06-10 13:48:48.478590] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:34.184 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:34.184 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.184 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:34.444 [2024-06-10 13:48:48.796681] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15d5de0 00:20:34.444 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:34.444 Zero copy mechanism will not be used. 00:20:34.444 Running I/O for 60 seconds... 00:20:34.444 [2024-06-10 13:48:48.887408] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:34.444 [2024-06-10 13:48:48.887585] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15d5de0 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.444 13:48:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.704 13:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.704 "name": "raid_bdev1", 00:20:34.704 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:34.704 "strip_size_kb": 0, 00:20:34.704 "state": "online", 00:20:34.704 "raid_level": "raid1", 00:20:34.704 "superblock": true, 00:20:34.704 "num_base_bdevs": 2, 00:20:34.704 "num_base_bdevs_discovered": 1, 00:20:34.704 "num_base_bdevs_operational": 1, 00:20:34.704 "base_bdevs_list": [ 00:20:34.704 { 00:20:34.704 "name": null, 00:20:34.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.704 "is_configured": false, 00:20:34.704 "data_offset": 2048, 00:20:34.704 "data_size": 63488 00:20:34.704 }, 00:20:34.704 { 00:20:34.704 "name": "BaseBdev2", 00:20:34.704 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:34.704 "is_configured": true, 00:20:34.704 "data_offset": 2048, 00:20:34.704 "data_size": 63488 00:20:34.704 } 00:20:34.704 ] 00:20:34.704 }' 00:20:34.704 13:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.704 13:48:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:35.273 13:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:35.534 [2024-06-10 13:48:49.884437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:35.534 13:48:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:35.534 [2024-06-10 13:48:49.947989] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15e9070 00:20:35.534 [2024-06-10 13:48:49.949711] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:35.794 [2024-06-10 13:48:50.050380] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:35.794 [2024-06-10 13:48:50.050673] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:35.794 [2024-06-10 13:48:50.167737] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:35.794 [2024-06-10 13:48:50.167867] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:36.733 [2024-06-10 13:48:50.892477] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:36.733 [2024-06-10 13:48:50.892591] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.733 13:48:50 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.733 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:36.733 "name": "raid_bdev1", 00:20:36.733 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:36.733 "strip_size_kb": 0, 00:20:36.733 "state": "online", 00:20:36.733 "raid_level": "raid1", 00:20:36.733 "superblock": true, 00:20:36.733 "num_base_bdevs": 2, 00:20:36.733 "num_base_bdevs_discovered": 2, 00:20:36.733 "num_base_bdevs_operational": 2, 00:20:36.733 "process": { 00:20:36.733 "type": "rebuild", 00:20:36.733 "target": "spare", 00:20:36.733 "progress": { 00:20:36.733 "blocks": 18432, 00:20:36.733 "percent": 29 00:20:36.733 } 00:20:36.733 }, 00:20:36.733 "base_bdevs_list": [ 00:20:36.733 { 00:20:36.733 "name": "spare", 00:20:36.733 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:36.733 "is_configured": true, 00:20:36.733 "data_offset": 2048, 00:20:36.733 "data_size": 63488 00:20:36.733 }, 00:20:36.733 { 00:20:36.733 "name": "BaseBdev2", 00:20:36.733 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:36.733 "is_configured": true, 00:20:36.733 "data_offset": 2048, 00:20:36.733 "data_size": 63488 00:20:36.733 } 00:20:36.733 ] 00:20:36.733 }' 00:20:36.733 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:36.733 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:36.733 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:36.993 [2024-06-10 13:48:51.219952] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:36.994 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:36.994 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:36.994 [2024-06-10 13:48:51.335629] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:36.994 [2024-06-10 13:48:51.335755] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:36.994 [2024-06-10 13:48:51.426011] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:37.254 [2024-06-10 13:48:51.566861] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:37.254 [2024-06-10 13:48:51.575264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:37.254 [2024-06-10 13:48:51.575282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:37.254 [2024-06-10 13:48:51.575292] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:37.254 [2024-06-10 13:48:51.593240] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15d5de0 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.254 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.514 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.514 "name": "raid_bdev1", 00:20:37.514 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:37.514 "strip_size_kb": 0, 00:20:37.514 "state": "online", 00:20:37.514 "raid_level": "raid1", 00:20:37.514 "superblock": true, 00:20:37.514 "num_base_bdevs": 2, 00:20:37.514 "num_base_bdevs_discovered": 1, 00:20:37.514 "num_base_bdevs_operational": 1, 00:20:37.514 "base_bdevs_list": [ 00:20:37.514 { 00:20:37.514 "name": null, 00:20:37.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.514 "is_configured": false, 00:20:37.514 "data_offset": 2048, 00:20:37.514 "data_size": 63488 00:20:37.514 }, 00:20:37.514 { 00:20:37.514 "name": "BaseBdev2", 00:20:37.514 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:37.514 "is_configured": true, 00:20:37.514 "data_offset": 2048, 00:20:37.514 "data_size": 63488 00:20:37.514 } 00:20:37.514 ] 00:20:37.514 }' 00:20:37.514 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.514 13:48:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:38.083 
13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.083 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.343 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:38.343 "name": "raid_bdev1", 00:20:38.343 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:38.343 "strip_size_kb": 0, 00:20:38.343 "state": "online", 00:20:38.343 "raid_level": "raid1", 00:20:38.343 "superblock": true, 00:20:38.343 "num_base_bdevs": 2, 00:20:38.343 "num_base_bdevs_discovered": 1, 00:20:38.343 "num_base_bdevs_operational": 1, 00:20:38.343 "base_bdevs_list": [ 00:20:38.343 { 00:20:38.343 "name": null, 00:20:38.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.343 "is_configured": false, 00:20:38.343 "data_offset": 2048, 00:20:38.343 "data_size": 63488 00:20:38.343 }, 00:20:38.343 { 00:20:38.343 "name": "BaseBdev2", 00:20:38.343 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:38.343 "is_configured": true, 00:20:38.343 "data_offset": 2048, 00:20:38.343 "data_size": 63488 00:20:38.343 } 00:20:38.343 ] 00:20:38.343 }' 00:20:38.343 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:38.343 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:38.343 13:48:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:38.343 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:38.343 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:38.603 [2024-06-10 13:48:52.867104] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.603 13:48:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:38.603 [2024-06-10 13:48:52.916600] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e0260 00:20:38.603 [2024-06-10 13:48:52.917807] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.603 [2024-06-10 13:48:53.044921] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.603 [2024-06-10 13:48:53.045199] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:38.923 [2024-06-10 13:48:53.261413] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:38.923 [2024-06-10 13:48:53.261569] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:39.528 [2024-06-10 13:48:53.739742] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.528 13:48:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.528 13:48:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.789 [2024-06-10 13:48:54.008212] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.789 "name": "raid_bdev1", 00:20:39.789 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:39.789 "strip_size_kb": 0, 00:20:39.789 "state": "online", 00:20:39.789 "raid_level": "raid1", 00:20:39.789 "superblock": true, 00:20:39.789 "num_base_bdevs": 2, 00:20:39.789 "num_base_bdevs_discovered": 2, 00:20:39.789 "num_base_bdevs_operational": 2, 00:20:39.789 "process": { 00:20:39.789 "type": "rebuild", 00:20:39.789 "target": "spare", 00:20:39.789 "progress": { 00:20:39.789 "blocks": 14336, 00:20:39.789 "percent": 22 00:20:39.789 } 00:20:39.789 }, 00:20:39.789 "base_bdevs_list": [ 00:20:39.789 { 00:20:39.789 "name": "spare", 00:20:39.789 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:39.789 "is_configured": true, 00:20:39.789 "data_offset": 2048, 00:20:39.789 "data_size": 63488 00:20:39.789 }, 00:20:39.789 { 00:20:39.789 "name": "BaseBdev2", 00:20:39.789 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:39.789 "is_configured": true, 00:20:39.789 "data_offset": 2048, 00:20:39.789 "data_size": 63488 00:20:39.789 } 00:20:39.789 ] 00:20:39.789 }' 00:20:39.789 13:48:54 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.789 [2024-06-10 13:48:54.139323] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:39.789 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=722 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.789 13:48:54 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.789 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.049 [2024-06-10 13:48:54.365315] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:40.049 "name": "raid_bdev1", 00:20:40.049 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:40.049 "strip_size_kb": 0, 00:20:40.049 "state": "online", 00:20:40.049 "raid_level": "raid1", 00:20:40.049 "superblock": true, 00:20:40.049 "num_base_bdevs": 2, 00:20:40.049 "num_base_bdevs_discovered": 2, 00:20:40.049 "num_base_bdevs_operational": 2, 00:20:40.049 "process": { 00:20:40.049 "type": "rebuild", 00:20:40.049 "target": "spare", 00:20:40.049 "progress": { 00:20:40.049 "blocks": 20480, 00:20:40.049 "percent": 32 00:20:40.049 } 00:20:40.049 }, 00:20:40.049 "base_bdevs_list": [ 00:20:40.049 { 00:20:40.049 "name": "spare", 00:20:40.049 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:40.049 "is_configured": true, 00:20:40.049 "data_offset": 2048, 00:20:40.049 "data_size": 63488 00:20:40.049 }, 00:20:40.049 { 00:20:40.049 "name": "BaseBdev2", 00:20:40.049 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:40.049 "is_configured": true, 00:20:40.049 "data_offset": 2048, 00:20:40.049 "data_size": 63488 00:20:40.049 } 00:20:40.049 ] 00:20:40.049 }' 00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:40.049 13:48:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:40.049 [2024-06-10 13:48:54.489127] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:40.049 [2024-06-10 13:48:54.489276] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:40.990 [2024-06-10 13:48:55.178963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:20:40.990 [2024-06-10 13:48:55.302996] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:20:41.250 [2024-06-10 13:48:55.521088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:41.250 [2024-06-10 13:48:55.521328] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:41.250 [2024-06-10 13:48:55.630146] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.250 "name": "raid_bdev1", 00:20:41.250 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:41.250 "strip_size_kb": 0, 00:20:41.250 "state": "online", 00:20:41.250 "raid_level": "raid1", 00:20:41.250 "superblock": true, 00:20:41.250 "num_base_bdevs": 2, 00:20:41.250 "num_base_bdevs_discovered": 2, 00:20:41.250 "num_base_bdevs_operational": 2, 00:20:41.250 "process": { 00:20:41.250 "type": "rebuild", 00:20:41.250 "target": "spare", 00:20:41.250 "progress": { 00:20:41.250 "blocks": 40960, 00:20:41.250 "percent": 64 00:20:41.250 } 00:20:41.250 }, 00:20:41.250 "base_bdevs_list": [ 00:20:41.250 { 00:20:41.250 "name": "spare", 00:20:41.250 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:41.250 "is_configured": true, 00:20:41.250 "data_offset": 2048, 00:20:41.250 "data_size": 63488 00:20:41.250 }, 00:20:41.250 { 00:20:41.250 "name": "BaseBdev2", 00:20:41.250 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:41.250 "is_configured": true, 00:20:41.250 "data_offset": 2048, 00:20:41.250 "data_size": 63488 00:20:41.250 } 00:20:41.250 ] 00:20:41.250 }' 00:20:41.250 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:41.511 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:41.511 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.511 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:41.511 13:48:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.452 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.712 [2024-06-10 13:48:56.938546] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:42.712 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:42.712 "name": "raid_bdev1", 00:20:42.712 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:42.712 "strip_size_kb": 0, 00:20:42.712 "state": "online", 00:20:42.712 "raid_level": "raid1", 00:20:42.712 "superblock": true, 00:20:42.712 "num_base_bdevs": 2, 00:20:42.712 "num_base_bdevs_discovered": 2, 00:20:42.712 "num_base_bdevs_operational": 2, 00:20:42.712 "process": { 00:20:42.712 "type": "rebuild", 00:20:42.712 "target": "spare", 00:20:42.712 "progress": { 
00:20:42.712 "blocks": 63488, 00:20:42.712 "percent": 100 00:20:42.712 } 00:20:42.712 }, 00:20:42.712 "base_bdevs_list": [ 00:20:42.712 { 00:20:42.712 "name": "spare", 00:20:42.712 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:42.712 "is_configured": true, 00:20:42.712 "data_offset": 2048, 00:20:42.712 "data_size": 63488 00:20:42.712 }, 00:20:42.712 { 00:20:42.712 "name": "BaseBdev2", 00:20:42.712 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:42.712 "is_configured": true, 00:20:42.712 "data_offset": 2048, 00:20:42.712 "data_size": 63488 00:20:42.712 } 00:20:42.712 ] 00:20:42.712 }' 00:20:42.712 13:48:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:42.712 13:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:42.712 13:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:42.712 [2024-06-10 13:48:57.045643] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:42.712 [2024-06-10 13:48:57.047286] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.712 13:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:42.712 13:48:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.654 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.914 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.914 "name": "raid_bdev1", 00:20:43.914 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:43.914 "strip_size_kb": 0, 00:20:43.914 "state": "online", 00:20:43.914 "raid_level": "raid1", 00:20:43.914 "superblock": true, 00:20:43.914 "num_base_bdevs": 2, 00:20:43.914 "num_base_bdevs_discovered": 2, 00:20:43.914 "num_base_bdevs_operational": 2, 00:20:43.914 "base_bdevs_list": [ 00:20:43.914 { 00:20:43.914 "name": "spare", 00:20:43.914 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:43.914 "is_configured": true, 00:20:43.914 "data_offset": 2048, 00:20:43.915 "data_size": 63488 00:20:43.915 }, 00:20:43.915 { 00:20:43.915 "name": "BaseBdev2", 00:20:43.915 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:43.915 "is_configured": true, 00:20:43.915 "data_offset": 2048, 00:20:43.915 "data_size": 63488 00:20:43.915 } 00:20:43.915 ] 00:20:43.915 }' 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:20:43.915 
13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.915 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.175 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:44.175 "name": "raid_bdev1", 00:20:44.175 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:44.175 "strip_size_kb": 0, 00:20:44.175 "state": "online", 00:20:44.175 "raid_level": "raid1", 00:20:44.175 "superblock": true, 00:20:44.175 "num_base_bdevs": 2, 00:20:44.175 "num_base_bdevs_discovered": 2, 00:20:44.175 "num_base_bdevs_operational": 2, 00:20:44.175 "base_bdevs_list": [ 00:20:44.175 { 00:20:44.175 "name": "spare", 00:20:44.175 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:44.175 "is_configured": true, 00:20:44.175 "data_offset": 2048, 00:20:44.175 "data_size": 63488 00:20:44.175 }, 00:20:44.175 { 00:20:44.175 "name": "BaseBdev2", 00:20:44.175 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:44.175 "is_configured": true, 00:20:44.175 "data_offset": 2048, 00:20:44.175 "data_size": 63488 00:20:44.175 } 00:20:44.175 ] 00:20:44.175 }' 00:20:44.175 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:44.175 
13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:44.175 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.436 "name": "raid_bdev1", 00:20:44.436 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:44.436 
"strip_size_kb": 0, 00:20:44.436 "state": "online", 00:20:44.436 "raid_level": "raid1", 00:20:44.436 "superblock": true, 00:20:44.436 "num_base_bdevs": 2, 00:20:44.436 "num_base_bdevs_discovered": 2, 00:20:44.436 "num_base_bdevs_operational": 2, 00:20:44.436 "base_bdevs_list": [ 00:20:44.436 { 00:20:44.436 "name": "spare", 00:20:44.436 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:44.436 "is_configured": true, 00:20:44.436 "data_offset": 2048, 00:20:44.436 "data_size": 63488 00:20:44.436 }, 00:20:44.436 { 00:20:44.436 "name": "BaseBdev2", 00:20:44.436 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:44.436 "is_configured": true, 00:20:44.436 "data_offset": 2048, 00:20:44.436 "data_size": 63488 00:20:44.436 } 00:20:44.436 ] 00:20:44.436 }' 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.436 13:48:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:45.006 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:45.266 [2024-06-10 13:48:59.576530] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:45.266 [2024-06-10 13:48:59.576552] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:45.266 00:20:45.266 Latency(us) 00:20:45.266 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:45.266 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:45.266 raid_bdev1 : 10.82 106.61 319.82 0.00 0.00 12374.56 261.12 115343.36 00:20:45.266 =================================================================================================================== 00:20:45.266 Total : 106.61 319.82 0.00 0.00 12374.56 261.12 115343.36 00:20:45.266 [2024-06-10 13:48:59.652216] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.266 [2024-06-10 13:48:59.652242] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:45.266 [2024-06-10 13:48:59.652308] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:45.266 [2024-06-10 13:48:59.652314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1786d20 name raid_bdev1, state offline 00:20:45.266 0 00:20:45.266 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.266 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:45.526 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:45.526 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:45.527 13:48:59 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:45.527 13:48:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:45.787 /dev/nbd0 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:45.787 1+0 records in 00:20:45.787 1+0 records out 00:20:45.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292145 s, 14.0 MB/s 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 
00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:45.787 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 
00:20:46.047 /dev/nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:46.047 1+0 records in 00:20:46.047 1+0 records out 00:20:46.047 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271285 s, 15.1 MB/s 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 
00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:46.047 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:46.307 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:46.307 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:46.307 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@41 -- # break 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:46.308 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:46.567 13:49:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:46.827 [2024-06-10 13:49:01.278825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:46.827 [2024-06-10 13:49:01.278861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.827 [2024-06-10 13:49:01.278876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1784b20 00:20:46.827 [2024-06-10 13:49:01.278883] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.827 [2024-06-10 13:49:01.280279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.827 [2024-06-10 13:49:01.280302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:46.827 [2024-06-10 13:49:01.280364] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:46.827 [2024-06-10 13:49:01.280385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:46.827 [2024-06-10 13:49:01.280467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:46.827 spare 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:46.827 13:49:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.827 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.087 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.087 [2024-06-10 13:49:01.380760] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1788070 00:20:47.087 [2024-06-10 13:49:01.380770] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:47.087 [2024-06-10 13:49:01.380940] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1783c60 00:20:47.087 [2024-06-10 13:49:01.381061] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1788070 00:20:47.087 [2024-06-10 13:49:01.381068] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1788070 00:20:47.087 [2024-06-10 13:49:01.381155] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:47.087 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.087 "name": "raid_bdev1", 00:20:47.087 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:47.087 "strip_size_kb": 0, 00:20:47.087 "state": "online", 00:20:47.087 
"raid_level": "raid1", 00:20:47.087 "superblock": true, 00:20:47.087 "num_base_bdevs": 2, 00:20:47.087 "num_base_bdevs_discovered": 2, 00:20:47.087 "num_base_bdevs_operational": 2, 00:20:47.087 "base_bdevs_list": [ 00:20:47.087 { 00:20:47.087 "name": "spare", 00:20:47.087 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:47.087 "is_configured": true, 00:20:47.087 "data_offset": 2048, 00:20:47.087 "data_size": 63488 00:20:47.087 }, 00:20:47.087 { 00:20:47.087 "name": "BaseBdev2", 00:20:47.087 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:47.087 "is_configured": true, 00:20:47.087 "data_offset": 2048, 00:20:47.087 "data_size": 63488 00:20:47.087 } 00:20:47.087 ] 00:20:47.087 }' 00:20:47.087 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.087 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.655 13:49:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:47.915 "name": "raid_bdev1", 00:20:47.915 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 
00:20:47.915 "strip_size_kb": 0, 00:20:47.915 "state": "online", 00:20:47.915 "raid_level": "raid1", 00:20:47.915 "superblock": true, 00:20:47.915 "num_base_bdevs": 2, 00:20:47.915 "num_base_bdevs_discovered": 2, 00:20:47.915 "num_base_bdevs_operational": 2, 00:20:47.915 "base_bdevs_list": [ 00:20:47.915 { 00:20:47.915 "name": "spare", 00:20:47.915 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:47.915 "is_configured": true, 00:20:47.915 "data_offset": 2048, 00:20:47.915 "data_size": 63488 00:20:47.915 }, 00:20:47.915 { 00:20:47.915 "name": "BaseBdev2", 00:20:47.915 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:47.915 "is_configured": true, 00:20:47.915 "data_offset": 2048, 00:20:47.915 "data_size": 63488 00:20:47.915 } 00:20:47.915 ] 00:20:47.915 }' 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.915 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:48.175 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:48.175 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:48.435 [2024-06-10 13:49:02.658553] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:48.435 13:49:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.435 "name": "raid_bdev1", 00:20:48.435 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:48.435 "strip_size_kb": 0, 00:20:48.435 "state": "online", 00:20:48.435 "raid_level": "raid1", 00:20:48.435 "superblock": true, 00:20:48.435 "num_base_bdevs": 2, 00:20:48.435 "num_base_bdevs_discovered": 1, 00:20:48.435 "num_base_bdevs_operational": 1, 00:20:48.435 "base_bdevs_list": [ 00:20:48.435 { 00:20:48.435 "name": null, 00:20:48.435 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:48.435 "is_configured": false, 00:20:48.435 "data_offset": 2048, 00:20:48.435 "data_size": 63488 00:20:48.435 }, 00:20:48.435 { 00:20:48.435 "name": "BaseBdev2", 00:20:48.435 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:48.435 "is_configured": true, 00:20:48.435 "data_offset": 2048, 00:20:48.435 "data_size": 63488 00:20:48.435 } 00:20:48.435 ] 00:20:48.435 }' 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.435 13:49:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:49.005 13:49:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:49.265 [2024-06-10 13:49:03.629145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:49.265 [2024-06-10 13:49:03.629274] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:49.265 [2024-06-10 13:49:03.629285] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:49.265 [2024-06-10 13:49:03.629308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:49.265 [2024-06-10 13:49:03.633005] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1785940 00:20:49.265 [2024-06-10 13:49:03.634723] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:49.265 13:49:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.205 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:50.465 "name": "raid_bdev1", 00:20:50.465 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:50.465 "strip_size_kb": 0, 00:20:50.465 "state": "online", 00:20:50.465 "raid_level": "raid1", 00:20:50.465 "superblock": true, 00:20:50.465 "num_base_bdevs": 2, 00:20:50.465 "num_base_bdevs_discovered": 2, 00:20:50.465 "num_base_bdevs_operational": 2, 00:20:50.465 "process": { 00:20:50.465 "type": "rebuild", 00:20:50.465 "target": "spare", 00:20:50.465 "progress": { 00:20:50.465 "blocks": 22528, 
00:20:50.465 "percent": 35 00:20:50.465 } 00:20:50.465 }, 00:20:50.465 "base_bdevs_list": [ 00:20:50.465 { 00:20:50.465 "name": "spare", 00:20:50.465 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:50.465 "is_configured": true, 00:20:50.465 "data_offset": 2048, 00:20:50.465 "data_size": 63488 00:20:50.465 }, 00:20:50.465 { 00:20:50.465 "name": "BaseBdev2", 00:20:50.465 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:50.465 "is_configured": true, 00:20:50.465 "data_offset": 2048, 00:20:50.465 "data_size": 63488 00:20:50.465 } 00:20:50.465 ] 00:20:50.465 }' 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:50.465 13:49:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:50.726 [2024-06-10 13:49:05.115786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:50.726 [2024-06-10 13:49:05.143799] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:50.726 [2024-06-10 13:49:05.143838] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:50.726 [2024-06-10 13:49:05.143849] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:50.726 [2024-06-10 13:49:05.143854] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.726 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.986 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.986 "name": "raid_bdev1", 00:20:50.986 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:50.986 "strip_size_kb": 0, 00:20:50.986 "state": "online", 00:20:50.986 "raid_level": "raid1", 00:20:50.986 "superblock": true, 00:20:50.986 "num_base_bdevs": 2, 00:20:50.986 "num_base_bdevs_discovered": 1, 00:20:50.986 "num_base_bdevs_operational": 1, 00:20:50.986 "base_bdevs_list": [ 00:20:50.986 { 00:20:50.986 "name": null, 00:20:50.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.986 "is_configured": false, 00:20:50.986 
"data_offset": 2048, 00:20:50.986 "data_size": 63488 00:20:50.986 }, 00:20:50.986 { 00:20:50.986 "name": "BaseBdev2", 00:20:50.986 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:50.986 "is_configured": true, 00:20:50.986 "data_offset": 2048, 00:20:50.986 "data_size": 63488 00:20:50.986 } 00:20:50.986 ] 00:20:50.986 }' 00:20:50.986 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.986 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:51.558 13:49:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:51.819 [2024-06-10 13:49:06.102478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:51.819 [2024-06-10 13:49:06.102515] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.819 [2024-06-10 13:49:06.102533] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1781410 00:20:51.819 [2024-06-10 13:49:06.102540] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.819 [2024-06-10 13:49:06.102869] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.819 [2024-06-10 13:49:06.102882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:51.819 [2024-06-10 13:49:06.102947] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:51.819 [2024-06-10 13:49:06.102954] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:20:51.819 [2024-06-10 13:49:06.102960] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:51.819 [2024-06-10 13:49:06.102971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:51.819 [2024-06-10 13:49:06.106703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ed5a0 00:20:51.819 spare 00:20:51.819 [2024-06-10 13:49:06.107912] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:51.819 13:49:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.758 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.018 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:53.019 "name": "raid_bdev1", 00:20:53.019 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:53.019 "strip_size_kb": 0, 00:20:53.019 "state": "online", 00:20:53.019 "raid_level": "raid1", 00:20:53.019 "superblock": true, 00:20:53.019 "num_base_bdevs": 2, 00:20:53.019 "num_base_bdevs_discovered": 2, 00:20:53.019 "num_base_bdevs_operational": 2, 00:20:53.019 "process": { 00:20:53.019 "type": "rebuild", 00:20:53.019 "target": "spare", 00:20:53.019 "progress": { 00:20:53.019 
"blocks": 22528, 00:20:53.019 "percent": 35 00:20:53.019 } 00:20:53.019 }, 00:20:53.019 "base_bdevs_list": [ 00:20:53.019 { 00:20:53.019 "name": "spare", 00:20:53.019 "uuid": "f9fd5d71-ac98-5d82-8ec0-04e4ddc227a8", 00:20:53.019 "is_configured": true, 00:20:53.019 "data_offset": 2048, 00:20:53.019 "data_size": 63488 00:20:53.019 }, 00:20:53.019 { 00:20:53.019 "name": "BaseBdev2", 00:20:53.019 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:53.019 "is_configured": true, 00:20:53.019 "data_offset": 2048, 00:20:53.019 "data_size": 63488 00:20:53.019 } 00:20:53.019 ] 00:20:53.019 }' 00:20:53.019 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:53.019 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:53.019 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:53.019 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:53.019 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:53.279 [2024-06-10 13:49:07.604421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:53.279 [2024-06-10 13:49:07.617000] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:53.279 [2024-06-10 13:49:07.617032] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.279 [2024-06-10 13:49:07.617042] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:53.279 [2024-06-10 13:49:07.617046] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.279 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.539 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.539 "name": "raid_bdev1", 00:20:53.539 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:53.539 "strip_size_kb": 0, 00:20:53.539 "state": "online", 00:20:53.539 "raid_level": "raid1", 00:20:53.539 "superblock": true, 00:20:53.539 "num_base_bdevs": 2, 00:20:53.539 "num_base_bdevs_discovered": 1, 00:20:53.539 "num_base_bdevs_operational": 1, 00:20:53.539 "base_bdevs_list": [ 00:20:53.539 { 00:20:53.539 "name": null, 00:20:53.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.539 "is_configured": false, 00:20:53.539 
"data_offset": 2048, 00:20:53.539 "data_size": 63488 00:20:53.539 }, 00:20:53.539 { 00:20:53.539 "name": "BaseBdev2", 00:20:53.539 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:53.539 "is_configured": true, 00:20:53.539 "data_offset": 2048, 00:20:53.539 "data_size": 63488 00:20:53.539 } 00:20:53.539 ] 00:20:53.539 }' 00:20:53.539 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.539 13:49:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.110 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:54.371 "name": "raid_bdev1", 00:20:54.371 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:54.371 "strip_size_kb": 0, 00:20:54.371 "state": "online", 00:20:54.371 "raid_level": "raid1", 00:20:54.371 "superblock": true, 00:20:54.371 "num_base_bdevs": 2, 00:20:54.371 "num_base_bdevs_discovered": 1, 00:20:54.371 "num_base_bdevs_operational": 1, 00:20:54.371 "base_bdevs_list": [ 00:20:54.371 { 00:20:54.371 "name": null, 00:20:54.371 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:54.371 "is_configured": false, 00:20:54.371 "data_offset": 2048, 00:20:54.371 "data_size": 63488 00:20:54.371 }, 00:20:54.371 { 00:20:54.371 "name": "BaseBdev2", 00:20:54.371 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:54.371 "is_configured": true, 00:20:54.371 "data_offset": 2048, 00:20:54.371 "data_size": 63488 00:20:54.371 } 00:20:54.371 ] 00:20:54.371 }' 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:54.371 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:54.632 13:49:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:54.632 [2024-06-10 13:49:09.084907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:54.632 [2024-06-10 13:49:09.084941] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:54.632 [2024-06-10 13:49:09.084955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1780760 00:20:54.632 [2024-06-10 13:49:09.084962] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:54.632 [2024-06-10 13:49:09.085270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:54.632 [2024-06-10 13:49:09.085283] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:54.632 [2024-06-10 13:49:09.085333] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:54.632 [2024-06-10 13:49:09.085341] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:54.632 [2024-06-10 13:49:09.085347] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:54.632 BaseBdev1 00:20:54.632 13:49:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.019 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.020 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.020 13:49:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.020 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.020 "name": "raid_bdev1", 00:20:56.020 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:56.020 "strip_size_kb": 0, 00:20:56.020 "state": "online", 00:20:56.020 "raid_level": "raid1", 00:20:56.020 "superblock": true, 00:20:56.020 "num_base_bdevs": 2, 00:20:56.020 "num_base_bdevs_discovered": 1, 00:20:56.020 "num_base_bdevs_operational": 1, 00:20:56.020 "base_bdevs_list": [ 00:20:56.020 { 00:20:56.020 "name": null, 00:20:56.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.020 "is_configured": false, 00:20:56.020 "data_offset": 2048, 00:20:56.020 "data_size": 63488 00:20:56.020 }, 00:20:56.020 { 00:20:56.020 "name": "BaseBdev2", 00:20:56.020 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:56.020 "is_configured": true, 00:20:56.020 "data_offset": 2048, 00:20:56.020 "data_size": 63488 00:20:56.020 } 00:20:56.020 ] 00:20:56.020 }' 00:20:56.020 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.020 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.589 13:49:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.589 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:56.589 "name": "raid_bdev1", 00:20:56.589 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:56.589 "strip_size_kb": 0, 00:20:56.589 "state": "online", 00:20:56.589 "raid_level": "raid1", 00:20:56.589 "superblock": true, 00:20:56.589 "num_base_bdevs": 2, 00:20:56.589 "num_base_bdevs_discovered": 1, 00:20:56.589 "num_base_bdevs_operational": 1, 00:20:56.589 "base_bdevs_list": [ 00:20:56.589 { 00:20:56.589 "name": null, 00:20:56.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.589 "is_configured": false, 00:20:56.589 "data_offset": 2048, 00:20:56.589 "data_size": 63488 00:20:56.589 }, 00:20:56.589 { 00:20:56.589 "name": "BaseBdev2", 00:20:56.589 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:56.589 "is_configured": true, 00:20:56.589 "data_offset": 2048, 00:20:56.589 "data_size": 63488 00:20:56.589 } 00:20:56.589 ] 00:20:56.589 }' 00:20:56.589 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local 
es=0 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:56.849 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:56.849 [2024-06-10 13:49:11.322883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:56.849 [2024-06-10 13:49:11.322992] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:20:56.849 
[2024-06-10 13:49:11.323001] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:57.108 request: 00:20:57.108 { 00:20:57.108 "raid_bdev": "raid_bdev1", 00:20:57.108 "base_bdev": "BaseBdev1", 00:20:57.108 "method": "bdev_raid_add_base_bdev", 00:20:57.108 "req_id": 1 00:20:57.108 } 00:20:57.108 Got JSON-RPC error response 00:20:57.108 response: 00:20:57.108 { 00:20:57.108 "code": -22, 00:20:57.108 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:57.108 } 00:20:57.108 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:20:57.108 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:20:57.108 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:20:57.108 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:20:57.108 13:49:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.048 13:49:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.048 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.307 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.307 "name": "raid_bdev1", 00:20:58.307 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:58.307 "strip_size_kb": 0, 00:20:58.307 "state": "online", 00:20:58.307 "raid_level": "raid1", 00:20:58.307 "superblock": true, 00:20:58.307 "num_base_bdevs": 2, 00:20:58.307 "num_base_bdevs_discovered": 1, 00:20:58.307 "num_base_bdevs_operational": 1, 00:20:58.307 "base_bdevs_list": [ 00:20:58.307 { 00:20:58.307 "name": null, 00:20:58.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.307 "is_configured": false, 00:20:58.307 "data_offset": 2048, 00:20:58.307 "data_size": 63488 00:20:58.307 }, 00:20:58.307 { 00:20:58.307 "name": "BaseBdev2", 00:20:58.307 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:58.307 "is_configured": true, 00:20:58.307 "data_offset": 2048, 00:20:58.307 "data_size": 63488 00:20:58.307 } 00:20:58.307 ] 00:20:58.307 }' 00:20:58.307 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.307 13:49:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:58.876 13:49:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:58.876 "name": "raid_bdev1", 00:20:58.876 "uuid": "7cec7a79-3c30-4de0-b9e8-d433e41f5f71", 00:20:58.876 "strip_size_kb": 0, 00:20:58.876 "state": "online", 00:20:58.876 "raid_level": "raid1", 00:20:58.876 "superblock": true, 00:20:58.876 "num_base_bdevs": 2, 00:20:58.876 "num_base_bdevs_discovered": 1, 00:20:58.876 "num_base_bdevs_operational": 1, 00:20:58.876 "base_bdevs_list": [ 00:20:58.876 { 00:20:58.876 "name": null, 00:20:58.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.876 "is_configured": false, 00:20:58.876 "data_offset": 2048, 00:20:58.876 "data_size": 63488 00:20:58.876 }, 00:20:58.876 { 00:20:58.876 "name": "BaseBdev2", 00:20:58.876 "uuid": "5530ee0d-da36-5256-9111-9dc83c9cd7ae", 00:20:58.876 "is_configured": true, 00:20:58.876 "data_offset": 2048, 00:20:58.876 "data_size": 63488 00:20:58.876 } 00:20:58.876 ] 00:20:58.876 }' 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:58.876 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:59.136 13:49:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1634491 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 1634491 ']' 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 1634491 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1634491 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1634491' 00:20:59.136 killing process with pid 1634491 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 1634491 00:20:59.136 Received shutdown signal, test time was about 24.592900 seconds 00:20:59.136 00:20:59.136 Latency(us) 00:20:59.136 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:59.136 =================================================================================================================== 00:20:59.136 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:59.136 [2024-06-10 13:49:13.449069] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:59.136 [2024-06-10 13:49:13.449149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:59.136 [2024-06-10 13:49:13.449193] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:20:59.136 [2024-06-10 13:49:13.449201] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1788070 name raid_bdev1, state offline 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 1634491 00:20:59.136 [2024-06-10 13:49:13.461561] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:59.136 00:20:59.136 real 0m28.558s 00:20:59.136 user 0m44.905s 00:20:59.136 sys 0m3.083s 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:59.136 13:49:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:59.136 ************************************ 00:20:59.136 END TEST raid_rebuild_test_sb_io 00:20:59.136 ************************************ 00:20:59.397 13:49:13 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:20:59.397 13:49:13 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:20:59.397 13:49:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:20:59.397 13:49:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:59.397 13:49:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:59.397 ************************************ 00:20:59.397 START TEST raid_rebuild_test 00:20:59.397 ************************************ 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false false true 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:59.397 
13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1640431 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1640431 /var/tmp/spdk-raid.sock 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 1640431 ']' 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:59.397 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:59.397 13:49:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:59.397 [2024-06-10 13:49:13.731954] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:20:59.397 [2024-06-10 13:49:13.732002] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640431 ] 00:20:59.397 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:59.397 Zero copy mechanism will not be used. 00:20:59.397 [2024-06-10 13:49:13.824918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:59.658 [2024-06-10 13:49:13.902363] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:20:59.658 [2024-06-10 13:49:13.952703] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:59.658 [2024-06-10 13:49:13.952729] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:00.227 13:49:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:00.227 13:49:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:21:00.227 13:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:00.227 13:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:00.487 BaseBdev1_malloc 00:21:00.487 13:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:00.747 [2024-06-10 13:49:14.972228] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:00.747 [2024-06-10 13:49:14.972264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.747 [2024-06-10 13:49:14.972278] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f7900 00:21:00.747 [2024-06-10 13:49:14.972285] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.747 [2024-06-10 13:49:14.973671] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.747 [2024-06-10 13:49:14.973691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:00.747 BaseBdev1 00:21:00.747 13:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:00.747 13:49:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:00.747 BaseBdev2_malloc 00:21:00.747 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:01.007 [2024-06-10 13:49:15.379491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:01.007 [2024-06-10 13:49:15.379523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.007 [2024-06-10 13:49:15.379535] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f89c0 00:21:01.007 [2024-06-10 13:49:15.379542] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.007 [2024-06-10 13:49:15.380823] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.007 [2024-06-10 13:49:15.380843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: BaseBdev2 00:21:01.007 BaseBdev2 00:21:01.007 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:01.007 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:01.267 BaseBdev3_malloc 00:21:01.267 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:01.527 [2024-06-10 13:49:15.782587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:01.527 [2024-06-10 13:49:15.782616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.527 [2024-06-10 13:49:15.782628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a2e80 00:21:01.527 [2024-06-10 13:49:15.782635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.527 [2024-06-10 13:49:15.783906] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.527 [2024-06-10 13:49:15.783924] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:01.527 BaseBdev3 00:21:01.527 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:01.527 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:01.527 BaseBdev4_malloc 00:21:01.527 13:49:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:01.788 [2024-06-10 13:49:16.173596] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:01.788 [2024-06-10 13:49:16.173624] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.788 [2024-06-10 13:49:16.173637] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a5b20 00:21:01.788 [2024-06-10 13:49:16.173643] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.788 [2024-06-10 13:49:16.174903] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.788 [2024-06-10 13:49:16.174921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:01.788 BaseBdev4 00:21:01.788 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:02.048 spare_malloc 00:21:02.048 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:02.309 spare_delay 00:21:02.309 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:02.309 [2024-06-10 13:49:16.777173] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:02.309 [2024-06-10 13:49:16.777205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:02.309 [2024-06-10 13:49:16.777224] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a8730 00:21:02.309 [2024-06-10 13:49:16.777231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:02.309 [2024-06-10 13:49:16.778508] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:02.309 [2024-06-10 13:49:16.778528] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:02.309 spare 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:02.570 [2024-06-10 13:49:16.965661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:02.570 [2024-06-10 13:49:16.966726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:02.570 [2024-06-10 13:49:16.966772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:02.570 [2024-06-10 13:49:16.966808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:02.570 [2024-06-10 13:49:16.966873] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28a7330 00:21:02.570 [2024-06-10 13:49:16.966879] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:02.570 [2024-06-10 13:49:16.967050] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28a7c70 00:21:02.570 [2024-06-10 13:49:16.967177] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28a7330 00:21:02.570 [2024-06-10 13:49:16.967184] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28a7330 00:21:02.570 [2024-06-10 13:49:16.967270] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:02.570 13:49:16 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.570 13:49:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.830 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.830 "name": "raid_bdev1", 00:21:02.830 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:02.830 "strip_size_kb": 0, 00:21:02.830 "state": "online", 00:21:02.830 "raid_level": "raid1", 00:21:02.830 "superblock": false, 00:21:02.830 "num_base_bdevs": 4, 00:21:02.830 "num_base_bdevs_discovered": 4, 00:21:02.830 "num_base_bdevs_operational": 4, 00:21:02.830 "base_bdevs_list": [ 00:21:02.830 { 00:21:02.830 "name": "BaseBdev1", 00:21:02.830 "uuid": "d6927abf-e368-5993-adf0-327302ceb7fd", 00:21:02.830 "is_configured": true, 00:21:02.830 "data_offset": 0, 00:21:02.830 "data_size": 65536 00:21:02.830 }, 00:21:02.830 { 00:21:02.830 "name": "BaseBdev2", 00:21:02.830 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:02.830 "is_configured": true, 00:21:02.831 
"data_offset": 0, 00:21:02.831 "data_size": 65536 00:21:02.831 }, 00:21:02.831 { 00:21:02.831 "name": "BaseBdev3", 00:21:02.831 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:02.831 "is_configured": true, 00:21:02.831 "data_offset": 0, 00:21:02.831 "data_size": 65536 00:21:02.831 }, 00:21:02.831 { 00:21:02.831 "name": "BaseBdev4", 00:21:02.831 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:02.831 "is_configured": true, 00:21:02.831 "data_offset": 0, 00:21:02.831 "data_size": 65536 00:21:02.831 } 00:21:02.831 ] 00:21:02.831 }' 00:21:02.831 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.831 13:49:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.401 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:03.401 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:03.661 [2024-06-10 13:49:17.904264] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:03.661 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:03.661 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.661 13:49:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:03.661 13:49:18 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:03.661 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:03.921 [2024-06-10 13:49:18.297079] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28a7c70 00:21:03.921 /dev/nbd0 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:03.921 
13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:03.921 1+0 records in 00:21:03.921 1+0 records out 00:21:03.921 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291031 s, 14.1 MB/s 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:03.921 13:49:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:13.994 65536+0 records in 00:21:13.994 65536+0 records out 00:21:13.994 33554432 bytes (34 MB, 32 MiB) copied, 9.26269 s, 3.6 MB/s 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # 
nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:13.994 [2024-06-10 13:49:27.789714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:13.994 [2024-06-10 13:49:27.978227] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:13.994 13:49:27 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.994 13:49:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.994 13:49:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.994 "name": "raid_bdev1", 00:21:13.994 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:13.994 "strip_size_kb": 0, 00:21:13.994 "state": "online", 00:21:13.994 "raid_level": "raid1", 00:21:13.994 "superblock": false, 00:21:13.994 "num_base_bdevs": 4, 00:21:13.994 "num_base_bdevs_discovered": 3, 00:21:13.994 "num_base_bdevs_operational": 3, 00:21:13.994 "base_bdevs_list": [ 00:21:13.994 { 00:21:13.994 "name": null, 00:21:13.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.994 "is_configured": false, 
00:21:13.994 "data_offset": 0, 00:21:13.994 "data_size": 65536 00:21:13.994 }, 00:21:13.994 { 00:21:13.994 "name": "BaseBdev2", 00:21:13.994 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:13.994 "is_configured": true, 00:21:13.994 "data_offset": 0, 00:21:13.994 "data_size": 65536 00:21:13.994 }, 00:21:13.994 { 00:21:13.994 "name": "BaseBdev3", 00:21:13.994 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:13.994 "is_configured": true, 00:21:13.994 "data_offset": 0, 00:21:13.994 "data_size": 65536 00:21:13.994 }, 00:21:13.994 { 00:21:13.994 "name": "BaseBdev4", 00:21:13.994 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:13.994 "is_configured": true, 00:21:13.994 "data_offset": 0, 00:21:13.994 "data_size": 65536 00:21:13.994 } 00:21:13.994 ] 00:21:13.994 }' 00:21:13.994 13:49:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.994 13:49:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.254 13:49:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:14.517 [2024-06-10 13:49:28.908586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:14.517 [2024-06-10 13:49:28.911441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x282cb10 00:21:14.517 [2024-06-10 13:49:28.913187] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:14.517 13:49:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:15.455 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:15.455 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:15.455 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:21:15.455 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:15.455 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:15.715 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.715 13:49:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.715 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:15.715 "name": "raid_bdev1", 00:21:15.715 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:15.715 "strip_size_kb": 0, 00:21:15.715 "state": "online", 00:21:15.715 "raid_level": "raid1", 00:21:15.715 "superblock": false, 00:21:15.715 "num_base_bdevs": 4, 00:21:15.715 "num_base_bdevs_discovered": 4, 00:21:15.715 "num_base_bdevs_operational": 4, 00:21:15.715 "process": { 00:21:15.715 "type": "rebuild", 00:21:15.715 "target": "spare", 00:21:15.715 "progress": { 00:21:15.715 "blocks": 24576, 00:21:15.715 "percent": 37 00:21:15.715 } 00:21:15.715 }, 00:21:15.715 "base_bdevs_list": [ 00:21:15.715 { 00:21:15.715 "name": "spare", 00:21:15.715 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:15.715 "is_configured": true, 00:21:15.715 "data_offset": 0, 00:21:15.715 "data_size": 65536 00:21:15.715 }, 00:21:15.715 { 00:21:15.715 "name": "BaseBdev2", 00:21:15.715 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:15.715 "is_configured": true, 00:21:15.715 "data_offset": 0, 00:21:15.715 "data_size": 65536 00:21:15.715 }, 00:21:15.715 { 00:21:15.715 "name": "BaseBdev3", 00:21:15.715 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:15.715 "is_configured": true, 00:21:15.715 "data_offset": 0, 00:21:15.715 "data_size": 65536 00:21:15.715 }, 00:21:15.715 { 00:21:15.715 "name": "BaseBdev4", 00:21:15.715 "uuid": 
"fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:15.715 "is_configured": true, 00:21:15.715 "data_offset": 0, 00:21:15.715 "data_size": 65536 00:21:15.715 } 00:21:15.715 ] 00:21:15.715 }' 00:21:15.715 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:15.715 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:15.975 [2024-06-10 13:49:30.414195] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:15.975 [2024-06-10 13:49:30.422208] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:15.975 [2024-06-10 13:49:30.422242] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:15.975 [2024-06-10 13:49:30.422254] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:15.975 [2024-06-10 13:49:30.422259] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.975 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.235 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.235 "name": "raid_bdev1", 00:21:16.235 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:16.235 "strip_size_kb": 0, 00:21:16.235 "state": "online", 00:21:16.235 "raid_level": "raid1", 00:21:16.235 "superblock": false, 00:21:16.235 "num_base_bdevs": 4, 00:21:16.235 "num_base_bdevs_discovered": 3, 00:21:16.235 "num_base_bdevs_operational": 3, 00:21:16.235 "base_bdevs_list": [ 00:21:16.235 { 00:21:16.235 "name": null, 00:21:16.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.235 "is_configured": false, 00:21:16.235 "data_offset": 0, 00:21:16.235 "data_size": 65536 00:21:16.235 }, 00:21:16.235 { 00:21:16.235 "name": "BaseBdev2", 00:21:16.235 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:16.235 "is_configured": true, 00:21:16.235 "data_offset": 0, 00:21:16.235 "data_size": 65536 00:21:16.235 }, 00:21:16.235 { 00:21:16.235 "name": "BaseBdev3", 00:21:16.235 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:16.235 "is_configured": true, 00:21:16.235 "data_offset": 0, 00:21:16.235 "data_size": 65536 
00:21:16.235 }, 00:21:16.235 { 00:21:16.235 "name": "BaseBdev4", 00:21:16.235 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:16.235 "is_configured": true, 00:21:16.235 "data_offset": 0, 00:21:16.235 "data_size": 65536 00:21:16.235 } 00:21:16.235 ] 00:21:16.235 }' 00:21:16.235 13:49:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.235 13:49:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.804 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:17.064 "name": "raid_bdev1", 00:21:17.064 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:17.064 "strip_size_kb": 0, 00:21:17.064 "state": "online", 00:21:17.064 "raid_level": "raid1", 00:21:17.064 "superblock": false, 00:21:17.064 "num_base_bdevs": 4, 00:21:17.064 "num_base_bdevs_discovered": 3, 00:21:17.064 "num_base_bdevs_operational": 3, 00:21:17.064 "base_bdevs_list": [ 00:21:17.064 { 00:21:17.064 "name": null, 00:21:17.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.064 "is_configured": false, 00:21:17.064 "data_offset": 0, 00:21:17.064 
"data_size": 65536 00:21:17.064 }, 00:21:17.064 { 00:21:17.064 "name": "BaseBdev2", 00:21:17.064 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:17.064 "is_configured": true, 00:21:17.064 "data_offset": 0, 00:21:17.064 "data_size": 65536 00:21:17.064 }, 00:21:17.064 { 00:21:17.064 "name": "BaseBdev3", 00:21:17.064 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:17.064 "is_configured": true, 00:21:17.064 "data_offset": 0, 00:21:17.064 "data_size": 65536 00:21:17.064 }, 00:21:17.064 { 00:21:17.064 "name": "BaseBdev4", 00:21:17.064 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:17.064 "is_configured": true, 00:21:17.064 "data_offset": 0, 00:21:17.064 "data_size": 65536 00:21:17.064 } 00:21:17.064 ] 00:21:17.064 }' 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:17.064 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:17.324 [2024-06-10 13:49:31.705589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:17.324 [2024-06-10 13:49:31.708521] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x282ec70 00:21:17.324 [2024-06-10 13:49:31.709770] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:17.324 13:49:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.262 
13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.262 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.521 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.521 "name": "raid_bdev1", 00:21:18.521 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:18.521 "strip_size_kb": 0, 00:21:18.521 "state": "online", 00:21:18.521 "raid_level": "raid1", 00:21:18.521 "superblock": false, 00:21:18.521 "num_base_bdevs": 4, 00:21:18.521 "num_base_bdevs_discovered": 4, 00:21:18.521 "num_base_bdevs_operational": 4, 00:21:18.521 "process": { 00:21:18.521 "type": "rebuild", 00:21:18.521 "target": "spare", 00:21:18.521 "progress": { 00:21:18.521 "blocks": 24576, 00:21:18.521 "percent": 37 00:21:18.521 } 00:21:18.521 }, 00:21:18.521 "base_bdevs_list": [ 00:21:18.521 { 00:21:18.521 "name": "spare", 00:21:18.521 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:18.521 "is_configured": true, 00:21:18.521 "data_offset": 0, 00:21:18.521 "data_size": 65536 00:21:18.521 }, 00:21:18.521 { 00:21:18.521 "name": "BaseBdev2", 00:21:18.521 "uuid": "057f2d4b-471f-5249-9b2f-d74c6356258d", 00:21:18.522 "is_configured": true, 00:21:18.522 "data_offset": 0, 00:21:18.522 "data_size": 65536 00:21:18.522 }, 00:21:18.522 { 00:21:18.522 "name": "BaseBdev3", 00:21:18.522 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:18.522 
"is_configured": true, 00:21:18.522 "data_offset": 0, 00:21:18.522 "data_size": 65536 00:21:18.522 }, 00:21:18.522 { 00:21:18.522 "name": "BaseBdev4", 00:21:18.522 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:18.522 "is_configured": true, 00:21:18.522 "data_offset": 0, 00:21:18.522 "data_size": 65536 00:21:18.522 } 00:21:18.522 ] 00:21:18.522 }' 00:21:18.522 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.522 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.522 13:49:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:18.781 [2024-06-10 13:49:33.210186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:18.781 [2024-06-10 13:49:33.219005] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x282ec70 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.781 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.041 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:19.041 "name": "raid_bdev1", 00:21:19.041 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:19.041 "strip_size_kb": 0, 00:21:19.041 "state": "online", 00:21:19.041 "raid_level": "raid1", 00:21:19.041 "superblock": false, 00:21:19.041 "num_base_bdevs": 4, 00:21:19.041 "num_base_bdevs_discovered": 3, 00:21:19.041 "num_base_bdevs_operational": 3, 00:21:19.041 "process": { 00:21:19.041 "type": "rebuild", 00:21:19.041 "target": "spare", 00:21:19.041 "progress": { 00:21:19.041 "blocks": 34816, 00:21:19.041 "percent": 53 00:21:19.041 } 00:21:19.041 }, 00:21:19.042 "base_bdevs_list": [ 00:21:19.042 { 00:21:19.042 "name": "spare", 00:21:19.042 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:19.042 "is_configured": true, 00:21:19.042 "data_offset": 0, 00:21:19.042 "data_size": 65536 00:21:19.042 }, 00:21:19.042 { 00:21:19.042 "name": null, 00:21:19.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.042 "is_configured": false, 00:21:19.042 "data_offset": 0, 00:21:19.042 "data_size": 65536 00:21:19.042 }, 00:21:19.042 { 00:21:19.042 "name": "BaseBdev3", 00:21:19.042 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:19.042 
"is_configured": true, 00:21:19.042 "data_offset": 0, 00:21:19.042 "data_size": 65536 00:21:19.042 }, 00:21:19.042 { 00:21:19.042 "name": "BaseBdev4", 00:21:19.042 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:19.042 "is_configured": true, 00:21:19.042 "data_offset": 0, 00:21:19.042 "data_size": 65536 00:21:19.042 } 00:21:19.042 ] 00:21:19.042 }' 00:21:19.042 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:19.042 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:19.042 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=761 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:19.301 "name": 
"raid_bdev1", 00:21:19.301 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:19.301 "strip_size_kb": 0, 00:21:19.301 "state": "online", 00:21:19.301 "raid_level": "raid1", 00:21:19.301 "superblock": false, 00:21:19.301 "num_base_bdevs": 4, 00:21:19.301 "num_base_bdevs_discovered": 3, 00:21:19.301 "num_base_bdevs_operational": 3, 00:21:19.301 "process": { 00:21:19.301 "type": "rebuild", 00:21:19.301 "target": "spare", 00:21:19.301 "progress": { 00:21:19.301 "blocks": 40960, 00:21:19.301 "percent": 62 00:21:19.301 } 00:21:19.301 }, 00:21:19.301 "base_bdevs_list": [ 00:21:19.301 { 00:21:19.301 "name": "spare", 00:21:19.301 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:19.301 "is_configured": true, 00:21:19.301 "data_offset": 0, 00:21:19.301 "data_size": 65536 00:21:19.301 }, 00:21:19.301 { 00:21:19.301 "name": null, 00:21:19.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.301 "is_configured": false, 00:21:19.301 "data_offset": 0, 00:21:19.301 "data_size": 65536 00:21:19.301 }, 00:21:19.301 { 00:21:19.301 "name": "BaseBdev3", 00:21:19.301 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:19.301 "is_configured": true, 00:21:19.301 "data_offset": 0, 00:21:19.301 "data_size": 65536 00:21:19.301 }, 00:21:19.301 { 00:21:19.301 "name": "BaseBdev4", 00:21:19.301 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:19.301 "is_configured": true, 00:21:19.301 "data_offset": 0, 00:21:19.301 "data_size": 65536 00:21:19.301 } 00:21:19.301 ] 00:21:19.301 }' 00:21:19.301 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:19.561 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:19.562 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:19.562 13:49:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:19.562 13:49:33 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.503 13:49:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.503 [2024-06-10 13:49:34.929139] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:20.503 [2024-06-10 13:49:34.929188] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:20.503 [2024-06-10 13:49:34.929217] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.763 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:20.763 "name": "raid_bdev1", 00:21:20.763 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:20.763 "strip_size_kb": 0, 00:21:20.763 "state": "online", 00:21:20.763 "raid_level": "raid1", 00:21:20.763 "superblock": false, 00:21:20.763 "num_base_bdevs": 4, 00:21:20.763 "num_base_bdevs_discovered": 3, 00:21:20.763 "num_base_bdevs_operational": 3, 00:21:20.763 "base_bdevs_list": [ 00:21:20.763 { 00:21:20.763 "name": "spare", 00:21:20.763 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:20.763 
"is_configured": true, 00:21:20.763 "data_offset": 0, 00:21:20.763 "data_size": 65536 00:21:20.763 }, 00:21:20.763 { 00:21:20.763 "name": null, 00:21:20.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:20.763 "is_configured": false, 00:21:20.763 "data_offset": 0, 00:21:20.763 "data_size": 65536 00:21:20.763 }, 00:21:20.763 { 00:21:20.763 "name": "BaseBdev3", 00:21:20.763 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:20.763 "is_configured": true, 00:21:20.763 "data_offset": 0, 00:21:20.763 "data_size": 65536 00:21:20.763 }, 00:21:20.763 { 00:21:20.763 "name": "BaseBdev4", 00:21:20.763 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:20.763 "is_configured": true, 00:21:20.763 "data_offset": 0, 00:21:20.763 "data_size": 65536 00:21:20.763 } 00:21:20.763 ] 00:21:20.763 }' 00:21:20.763 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:20.763 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:20.763 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.764 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:21.023 "name": "raid_bdev1", 00:21:21.023 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:21.023 "strip_size_kb": 0, 00:21:21.023 "state": "online", 00:21:21.023 "raid_level": "raid1", 00:21:21.023 "superblock": false, 00:21:21.023 "num_base_bdevs": 4, 00:21:21.023 "num_base_bdevs_discovered": 3, 00:21:21.023 "num_base_bdevs_operational": 3, 00:21:21.023 "base_bdevs_list": [ 00:21:21.023 { 00:21:21.023 "name": "spare", 00:21:21.023 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:21.023 "is_configured": true, 00:21:21.023 "data_offset": 0, 00:21:21.023 "data_size": 65536 00:21:21.023 }, 00:21:21.023 { 00:21:21.023 "name": null, 00:21:21.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.023 "is_configured": false, 00:21:21.023 "data_offset": 0, 00:21:21.023 "data_size": 65536 00:21:21.023 }, 00:21:21.023 { 00:21:21.023 "name": "BaseBdev3", 00:21:21.023 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:21.023 "is_configured": true, 00:21:21.023 "data_offset": 0, 00:21:21.023 "data_size": 65536 00:21:21.023 }, 00:21:21.023 { 00:21:21.023 "name": "BaseBdev4", 00:21:21.023 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:21.023 "is_configured": true, 00:21:21.023 "data_offset": 0, 00:21:21.023 "data_size": 65536 00:21:21.023 } 00:21:21.023 ] 00:21:21.023 }' 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:21.023 13:49:35 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.023 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.283 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.283 "name": "raid_bdev1", 00:21:21.283 "uuid": "79fbefd9-9e21-4b21-9909-157885224fac", 00:21:21.283 "strip_size_kb": 0, 00:21:21.283 "state": "online", 00:21:21.283 "raid_level": "raid1", 00:21:21.283 "superblock": false, 00:21:21.283 "num_base_bdevs": 4, 00:21:21.283 "num_base_bdevs_discovered": 3, 00:21:21.283 "num_base_bdevs_operational": 3, 00:21:21.283 "base_bdevs_list": [ 00:21:21.283 { 00:21:21.283 "name": 
"spare", 00:21:21.283 "uuid": "6576ce00-23b0-52b1-8dc5-72e5cf3a91a3", 00:21:21.283 "is_configured": true, 00:21:21.283 "data_offset": 0, 00:21:21.283 "data_size": 65536 00:21:21.283 }, 00:21:21.283 { 00:21:21.283 "name": null, 00:21:21.283 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.283 "is_configured": false, 00:21:21.283 "data_offset": 0, 00:21:21.283 "data_size": 65536 00:21:21.283 }, 00:21:21.283 { 00:21:21.283 "name": "BaseBdev3", 00:21:21.283 "uuid": "bcd6bed5-442b-5369-833c-7f4678782e09", 00:21:21.283 "is_configured": true, 00:21:21.283 "data_offset": 0, 00:21:21.283 "data_size": 65536 00:21:21.283 }, 00:21:21.283 { 00:21:21.283 "name": "BaseBdev4", 00:21:21.283 "uuid": "fcaa0c72-1cf1-5224-bbb1-dc42b9242eec", 00:21:21.283 "is_configured": true, 00:21:21.283 "data_offset": 0, 00:21:21.283 "data_size": 65536 00:21:21.283 } 00:21:21.283 ] 00:21:21.283 }' 00:21:21.283 13:49:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.283 13:49:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.853 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:22.113 [2024-06-10 13:49:36.411343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:22.113 [2024-06-10 13:49:36.411361] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:22.113 [2024-06-10 13:49:36.411411] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:22.113 [2024-06-10 13:49:36.411466] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:22.113 [2024-06-10 13:49:36.411472] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28a7330 name raid_bdev1, state offline 00:21:22.113 13:49:36 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.113 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:22.372 /dev/nbd0 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:22.372 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:22.632 1+0 records in 00:21:22.632 1+0 records out 00:21:22.632 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264762 s, 15.5 MB/s 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:22.632 13:49:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:22.632 /dev/nbd1 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:22.632 1+0 records in 00:21:22.632 1+0 records out 00:21:22.632 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273761 s, 15.0 MB/s 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:22.632 13:49:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:22.891 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1640431 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 1640431 ']' 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 1640431 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:23.152 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1640431 00:21:23.412 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # 
process_name=reactor_0 00:21:23.412 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:23.412 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1640431' 00:21:23.412 killing process with pid 1640431 00:21:23.412 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 1640431 00:21:23.412 Received shutdown signal, test time was about 60.000000 seconds 00:21:23.412 00:21:23.412 Latency(us) 00:21:23.412 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:23.412 =================================================================================================================== 00:21:23.412 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:23.413 [2024-06-10 13:49:37.663171] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.413 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 1640431 00:21:23.413 [2024-06-10 13:49:37.690048] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:23.413 13:49:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:21:23.413 00:21:23.413 real 0m24.147s 00:21:23.413 user 0m31.678s 00:21:23.413 sys 0m4.270s 00:21:23.413 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:23.413 13:49:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.413 ************************************ 00:21:23.413 END TEST raid_rebuild_test 00:21:23.413 ************************************ 00:21:23.413 13:49:37 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:21:23.413 13:49:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:21:23.413 13:49:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:23.413 13:49:37 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:21:23.673 ************************************ 00:21:23.673 START TEST raid_rebuild_test_sb 00:21:23.673 ************************************ 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true false true 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:23.673 13:49:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1645408 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1645408 /var/tmp/spdk-raid.sock 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 1645408 ']' 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w 
randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:23.673 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:23.673 13:49:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:23.673 [2024-06-10 13:49:37.957513] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:21:23.673 [2024-06-10 13:49:37.957565] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1645408 ] 00:21:23.673 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:23.673 Zero copy mechanism will not be used. 
00:21:23.673 [2024-06-10 13:49:38.049633] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:23.673 [2024-06-10 13:49:38.119695] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:21:23.933 [2024-06-10 13:49:38.170148] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:23.933 [2024-06-10 13:49:38.170177] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:24.502 13:49:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:24.502 13:49:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:21:24.502 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:24.502 13:49:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:24.761 BaseBdev1_malloc 00:21:24.761 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:24.761 [2024-06-10 13:49:39.205237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:24.761 [2024-06-10 13:49:39.205270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:24.761 [2024-06-10 13:49:39.205286] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2115900 00:21:24.761 [2024-06-10 13:49:39.205293] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:24.761 [2024-06-10 13:49:39.206684] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:24.761 [2024-06-10 13:49:39.206703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:24.761 BaseBdev1 
00:21:24.761 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:24.761 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:25.020 BaseBdev2_malloc 00:21:25.020 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:25.279 [2024-06-10 13:49:39.608485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:25.279 [2024-06-10 13:49:39.608514] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.279 [2024-06-10 13:49:39.608525] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21169c0 00:21:25.279 [2024-06-10 13:49:39.608532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.279 [2024-06-10 13:49:39.609784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.279 [2024-06-10 13:49:39.609803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:25.280 BaseBdev2 00:21:25.280 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:25.280 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:25.539 BaseBdev3_malloc 00:21:25.539 13:49:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:25.539 [2024-06-10 13:49:39.999482] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:25.539 [2024-06-10 13:49:39.999510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.539 [2024-06-10 13:49:39.999522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c0e80 00:21:25.539 [2024-06-10 13:49:39.999529] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.539 [2024-06-10 13:49:40.000776] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.539 [2024-06-10 13:49:40.000794] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:25.539 BaseBdev3 00:21:25.539 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:25.799 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:25.799 BaseBdev4_malloc 00:21:25.799 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:26.060 [2024-06-10 13:49:40.422601] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:26.060 [2024-06-10 13:49:40.422631] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.060 [2024-06-10 13:49:40.422643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c3b20 00:21:26.060 [2024-06-10 13:49:40.422650] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.060 [2024-06-10 13:49:40.423887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.060 [2024-06-10 13:49:40.423906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:21:26.060 BaseBdev4 00:21:26.060 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:26.320 spare_malloc 00:21:26.320 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:26.580 spare_delay 00:21:26.580 13:49:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:26.580 [2024-06-10 13:49:41.054167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:26.580 [2024-06-10 13:49:41.054199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.580 [2024-06-10 13:49:41.054212] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c6730 00:21:26.580 [2024-06-10 13:49:41.054219] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.580 [2024-06-10 13:49:41.055493] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.580 [2024-06-10 13:49:41.055514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:26.840 spare 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:26.840 [2024-06-10 13:49:41.254690] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:26.840 [2024-06-10 13:49:41.255750] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:26.840 [2024-06-10 13:49:41.255794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:26.840 [2024-06-10 13:49:41.255832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:26.840 [2024-06-10 13:49:41.255990] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22c5330 00:21:26.840 [2024-06-10 13:49:41.255998] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:26.840 [2024-06-10 13:49:41.256154] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c5c70 00:21:26.840 [2024-06-10 13:49:41.256283] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22c5330 00:21:26.840 [2024-06-10 13:49:41.256290] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22c5330 00:21:26.840 [2024-06-10 13:49:41.256363] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.840 
13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.840 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.100 13:49:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.100 "name": "raid_bdev1", 00:21:27.100 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:27.100 "strip_size_kb": 0, 00:21:27.100 "state": "online", 00:21:27.100 "raid_level": "raid1", 00:21:27.100 "superblock": true, 00:21:27.100 "num_base_bdevs": 4, 00:21:27.100 "num_base_bdevs_discovered": 4, 00:21:27.100 "num_base_bdevs_operational": 4, 00:21:27.100 "base_bdevs_list": [ 00:21:27.100 { 00:21:27.100 "name": "BaseBdev1", 00:21:27.100 "uuid": "638a5740-0f5c-586a-8791-3f2a8dc8ad2b", 00:21:27.100 "is_configured": true, 00:21:27.100 "data_offset": 2048, 00:21:27.100 "data_size": 63488 00:21:27.100 }, 00:21:27.100 { 00:21:27.100 "name": "BaseBdev2", 00:21:27.100 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:27.100 "is_configured": true, 00:21:27.100 "data_offset": 2048, 00:21:27.100 "data_size": 63488 00:21:27.100 }, 00:21:27.100 { 00:21:27.100 "name": "BaseBdev3", 00:21:27.100 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:27.100 "is_configured": true, 00:21:27.100 "data_offset": 2048, 00:21:27.100 "data_size": 63488 00:21:27.100 }, 00:21:27.100 { 00:21:27.100 "name": "BaseBdev4", 00:21:27.100 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:27.100 "is_configured": true, 00:21:27.100 "data_offset": 2048, 00:21:27.100 "data_size": 63488 00:21:27.100 } 00:21:27.100 ] 00:21:27.100 }' 00:21:27.100 13:49:41 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.100 13:49:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.669 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:27.669 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:27.929 [2024-06-10 13:49:42.229394] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:27.929 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:27.929 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.929 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:28.188 [2024-06-10 13:49:42.630205] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c5c70 00:21:28.188 /dev/nbd0 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:28.188 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:21:28.449 1+0 records in 00:21:28.449 1+0 records out 00:21:28.449 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236003 s, 17.4 MB/s 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:28.449 13:49:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:21:38.446 63488+0 records in 00:21:38.446 63488+0 records out 00:21:38.446 32505856 bytes (33 MB, 31 MiB) copied, 8.91245 s, 3.6 MB/s 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:38.446 [2024-06-10 13:49:51.824723] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:38.446 13:49:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:38.446 [2024-06-10 13:49:52.008740] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.446 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.446 "name": "raid_bdev1", 00:21:38.446 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:38.446 "strip_size_kb": 0, 00:21:38.446 "state": "online", 00:21:38.446 "raid_level": "raid1", 00:21:38.446 "superblock": true, 00:21:38.446 "num_base_bdevs": 4, 00:21:38.446 "num_base_bdevs_discovered": 3, 00:21:38.446 "num_base_bdevs_operational": 3, 00:21:38.446 "base_bdevs_list": [ 00:21:38.446 { 00:21:38.446 "name": null, 00:21:38.446 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.446 "is_configured": false, 00:21:38.446 "data_offset": 2048, 00:21:38.446 "data_size": 63488 00:21:38.446 }, 00:21:38.446 { 00:21:38.446 "name": "BaseBdev2", 00:21:38.446 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:38.446 "is_configured": true, 00:21:38.446 "data_offset": 2048, 00:21:38.446 "data_size": 63488 00:21:38.446 }, 00:21:38.446 { 00:21:38.446 "name": "BaseBdev3", 
00:21:38.446 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:38.447 "is_configured": true, 00:21:38.447 "data_offset": 2048, 00:21:38.447 "data_size": 63488 00:21:38.447 }, 00:21:38.447 { 00:21:38.447 "name": "BaseBdev4", 00:21:38.447 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:38.447 "is_configured": true, 00:21:38.447 "data_offset": 2048, 00:21:38.447 "data_size": 63488 00:21:38.447 } 00:21:38.447 ] 00:21:38.447 }' 00:21:38.447 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.447 13:49:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:38.447 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:38.706 [2024-06-10 13:49:52.927089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:38.706 [2024-06-10 13:49:52.929981] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c74b0 00:21:38.706 [2024-06-10 13:49:52.931729] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:38.707 13:49:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.645 13:49:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:39.905 "name": "raid_bdev1", 00:21:39.905 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:39.905 "strip_size_kb": 0, 00:21:39.905 "state": "online", 00:21:39.905 "raid_level": "raid1", 00:21:39.905 "superblock": true, 00:21:39.905 "num_base_bdevs": 4, 00:21:39.905 "num_base_bdevs_discovered": 4, 00:21:39.905 "num_base_bdevs_operational": 4, 00:21:39.905 "process": { 00:21:39.905 "type": "rebuild", 00:21:39.905 "target": "spare", 00:21:39.905 "progress": { 00:21:39.905 "blocks": 24576, 00:21:39.905 "percent": 38 00:21:39.905 } 00:21:39.905 }, 00:21:39.905 "base_bdevs_list": [ 00:21:39.905 { 00:21:39.905 "name": "spare", 00:21:39.905 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:39.905 "is_configured": true, 00:21:39.905 "data_offset": 2048, 00:21:39.905 "data_size": 63488 00:21:39.905 }, 00:21:39.905 { 00:21:39.905 "name": "BaseBdev2", 00:21:39.905 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:39.905 "is_configured": true, 00:21:39.905 "data_offset": 2048, 00:21:39.905 "data_size": 63488 00:21:39.905 }, 00:21:39.905 { 00:21:39.905 "name": "BaseBdev3", 00:21:39.905 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:39.905 "is_configured": true, 00:21:39.905 "data_offset": 2048, 00:21:39.905 "data_size": 63488 00:21:39.905 }, 00:21:39.905 { 00:21:39.905 "name": "BaseBdev4", 00:21:39.905 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:39.905 "is_configured": true, 00:21:39.905 "data_offset": 2048, 00:21:39.905 "data_size": 63488 00:21:39.905 } 00:21:39.905 ] 00:21:39.905 }' 00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:39.905 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:40.166 [2024-06-10 13:49:54.424065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:40.166 [2024-06-10 13:49:54.440710] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:40.166 [2024-06-10 13:49:54.440743] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:40.166 [2024-06-10 13:49:54.440755] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:40.166 [2024-06-10 13:49:54.440760] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.166 13:49:54 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.166 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.426 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.426 "name": "raid_bdev1", 00:21:40.426 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:40.426 "strip_size_kb": 0, 00:21:40.426 "state": "online", 00:21:40.426 "raid_level": "raid1", 00:21:40.426 "superblock": true, 00:21:40.426 "num_base_bdevs": 4, 00:21:40.426 "num_base_bdevs_discovered": 3, 00:21:40.426 "num_base_bdevs_operational": 3, 00:21:40.426 "base_bdevs_list": [ 00:21:40.426 { 00:21:40.426 "name": null, 00:21:40.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.426 "is_configured": false, 00:21:40.426 "data_offset": 2048, 00:21:40.426 "data_size": 63488 00:21:40.426 }, 00:21:40.426 { 00:21:40.426 "name": "BaseBdev2", 00:21:40.426 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:40.426 "is_configured": true, 00:21:40.426 "data_offset": 2048, 00:21:40.426 "data_size": 63488 00:21:40.426 }, 00:21:40.426 { 00:21:40.426 "name": "BaseBdev3", 00:21:40.426 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:40.426 "is_configured": true, 00:21:40.426 "data_offset": 2048, 00:21:40.426 "data_size": 63488 00:21:40.426 }, 00:21:40.426 { 00:21:40.426 "name": "BaseBdev4", 00:21:40.426 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:40.426 "is_configured": true, 00:21:40.426 "data_offset": 2048, 00:21:40.426 "data_size": 63488 
00:21:40.426 } 00:21:40.426 ] 00:21:40.426 }' 00:21:40.426 13:49:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.426 13:49:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:40.996 "name": "raid_bdev1", 00:21:40.996 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:40.996 "strip_size_kb": 0, 00:21:40.996 "state": "online", 00:21:40.996 "raid_level": "raid1", 00:21:40.996 "superblock": true, 00:21:40.996 "num_base_bdevs": 4, 00:21:40.996 "num_base_bdevs_discovered": 3, 00:21:40.996 "num_base_bdevs_operational": 3, 00:21:40.996 "base_bdevs_list": [ 00:21:40.996 { 00:21:40.996 "name": null, 00:21:40.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.996 "is_configured": false, 00:21:40.996 "data_offset": 2048, 00:21:40.996 "data_size": 63488 00:21:40.996 }, 00:21:40.996 { 00:21:40.996 "name": "BaseBdev2", 00:21:40.996 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:40.996 "is_configured": true, 00:21:40.996 
"data_offset": 2048, 00:21:40.996 "data_size": 63488 00:21:40.996 }, 00:21:40.996 { 00:21:40.996 "name": "BaseBdev3", 00:21:40.996 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:40.996 "is_configured": true, 00:21:40.996 "data_offset": 2048, 00:21:40.996 "data_size": 63488 00:21:40.996 }, 00:21:40.996 { 00:21:40.996 "name": "BaseBdev4", 00:21:40.996 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:40.996 "is_configured": true, 00:21:40.996 "data_offset": 2048, 00:21:40.996 "data_size": 63488 00:21:40.996 } 00:21:40.996 ] 00:21:40.996 }' 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:40.996 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:41.256 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:41.256 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:41.256 [2024-06-10 13:49:55.687611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:41.256 [2024-06-10 13:49:55.690467] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c5640 00:21:41.256 [2024-06-10 13:49:55.691710] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:41.256 13:49:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:42.640 "name": "raid_bdev1", 00:21:42.640 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:42.640 "strip_size_kb": 0, 00:21:42.640 "state": "online", 00:21:42.640 "raid_level": "raid1", 00:21:42.640 "superblock": true, 00:21:42.640 "num_base_bdevs": 4, 00:21:42.640 "num_base_bdevs_discovered": 4, 00:21:42.640 "num_base_bdevs_operational": 4, 00:21:42.640 "process": { 00:21:42.640 "type": "rebuild", 00:21:42.640 "target": "spare", 00:21:42.640 "progress": { 00:21:42.640 "blocks": 24576, 00:21:42.640 "percent": 38 00:21:42.640 } 00:21:42.640 }, 00:21:42.640 "base_bdevs_list": [ 00:21:42.640 { 00:21:42.640 "name": "spare", 00:21:42.640 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:42.640 "is_configured": true, 00:21:42.640 "data_offset": 2048, 00:21:42.640 "data_size": 63488 00:21:42.640 }, 00:21:42.640 { 00:21:42.640 "name": "BaseBdev2", 00:21:42.640 "uuid": "7575b1b9-9b9d-5027-b005-fa245d170d73", 00:21:42.640 "is_configured": true, 00:21:42.640 "data_offset": 2048, 00:21:42.640 "data_size": 63488 00:21:42.640 }, 00:21:42.640 { 00:21:42.640 "name": "BaseBdev3", 00:21:42.640 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:42.640 "is_configured": true, 00:21:42.640 "data_offset": 2048, 00:21:42.640 "data_size": 63488 00:21:42.640 }, 00:21:42.640 { 00:21:42.640 "name": 
"BaseBdev4", 00:21:42.640 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:42.640 "is_configured": true, 00:21:42.640 "data_offset": 2048, 00:21:42.640 "data_size": 63488 00:21:42.640 } 00:21:42.640 ] 00:21:42.640 }' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:42.640 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:21:42.640 13:49:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:42.901 [2024-06-10 13:49:57.176064] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:42.901 [2024-06-10 13:49:57.301252] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22c5640 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.901 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.161 "name": "raid_bdev1", 00:21:43.161 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:43.161 "strip_size_kb": 0, 00:21:43.161 "state": "online", 00:21:43.161 "raid_level": "raid1", 00:21:43.161 "superblock": true, 00:21:43.161 "num_base_bdevs": 4, 00:21:43.161 "num_base_bdevs_discovered": 3, 00:21:43.161 "num_base_bdevs_operational": 3, 00:21:43.161 "process": { 00:21:43.161 "type": "rebuild", 00:21:43.161 "target": "spare", 00:21:43.161 "progress": { 00:21:43.161 "blocks": 34816, 00:21:43.161 "percent": 54 00:21:43.161 } 00:21:43.161 }, 00:21:43.161 "base_bdevs_list": [ 00:21:43.161 { 00:21:43.161 "name": "spare", 00:21:43.161 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:43.161 "is_configured": true, 00:21:43.161 "data_offset": 2048, 00:21:43.161 "data_size": 63488 00:21:43.161 }, 00:21:43.161 { 00:21:43.161 "name": null, 00:21:43.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.161 "is_configured": false, 00:21:43.161 "data_offset": 2048, 00:21:43.161 
"data_size": 63488 00:21:43.161 }, 00:21:43.161 { 00:21:43.161 "name": "BaseBdev3", 00:21:43.161 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:43.161 "is_configured": true, 00:21:43.161 "data_offset": 2048, 00:21:43.161 "data_size": 63488 00:21:43.161 }, 00:21:43.161 { 00:21:43.161 "name": "BaseBdev4", 00:21:43.161 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:43.161 "is_configured": true, 00:21:43.161 "data_offset": 2048, 00:21:43.161 "data_size": 63488 00:21:43.161 } 00:21:43.161 ] 00:21:43.161 }' 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=785 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:43.161 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:43.162 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:43.162 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.162 13:49:57 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.421 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.421 "name": "raid_bdev1", 00:21:43.421 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:43.421 "strip_size_kb": 0, 00:21:43.421 "state": "online", 00:21:43.421 "raid_level": "raid1", 00:21:43.421 "superblock": true, 00:21:43.421 "num_base_bdevs": 4, 00:21:43.421 "num_base_bdevs_discovered": 3, 00:21:43.421 "num_base_bdevs_operational": 3, 00:21:43.421 "process": { 00:21:43.421 "type": "rebuild", 00:21:43.421 "target": "spare", 00:21:43.421 "progress": { 00:21:43.421 "blocks": 40960, 00:21:43.421 "percent": 64 00:21:43.421 } 00:21:43.421 }, 00:21:43.421 "base_bdevs_list": [ 00:21:43.421 { 00:21:43.421 "name": "spare", 00:21:43.421 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:43.421 "is_configured": true, 00:21:43.421 "data_offset": 2048, 00:21:43.421 "data_size": 63488 00:21:43.421 }, 00:21:43.421 { 00:21:43.422 "name": null, 00:21:43.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.422 "is_configured": false, 00:21:43.422 "data_offset": 2048, 00:21:43.422 "data_size": 63488 00:21:43.422 }, 00:21:43.422 { 00:21:43.422 "name": "BaseBdev3", 00:21:43.422 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:43.422 "is_configured": true, 00:21:43.422 "data_offset": 2048, 00:21:43.422 "data_size": 63488 00:21:43.422 }, 00:21:43.422 { 00:21:43.422 "name": "BaseBdev4", 00:21:43.422 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:43.422 "is_configured": true, 00:21:43.422 "data_offset": 2048, 00:21:43.422 "data_size": 63488 00:21:43.422 } 00:21:43.422 ] 00:21:43.422 }' 00:21:43.422 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:43.422 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:43.422 13:49:57 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:43.683 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:43.683 13:49:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:44.622 [2024-06-10 13:49:58.910809] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:44.622 [2024-06-10 13:49:58.910856] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:44.622 [2024-06-10 13:49:58.910935] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.622 13:49:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:44.882 "name": "raid_bdev1", 00:21:44.882 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:44.882 "strip_size_kb": 0, 00:21:44.882 "state": "online", 00:21:44.882 "raid_level": "raid1", 00:21:44.882 "superblock": true, 00:21:44.882 "num_base_bdevs": 
4, 00:21:44.882 "num_base_bdevs_discovered": 3, 00:21:44.882 "num_base_bdevs_operational": 3, 00:21:44.882 "base_bdevs_list": [ 00:21:44.882 { 00:21:44.882 "name": "spare", 00:21:44.882 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:44.882 "is_configured": true, 00:21:44.882 "data_offset": 2048, 00:21:44.882 "data_size": 63488 00:21:44.882 }, 00:21:44.882 { 00:21:44.882 "name": null, 00:21:44.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.882 "is_configured": false, 00:21:44.882 "data_offset": 2048, 00:21:44.882 "data_size": 63488 00:21:44.882 }, 00:21:44.882 { 00:21:44.882 "name": "BaseBdev3", 00:21:44.882 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:44.882 "is_configured": true, 00:21:44.882 "data_offset": 2048, 00:21:44.882 "data_size": 63488 00:21:44.882 }, 00:21:44.882 { 00:21:44.882 "name": "BaseBdev4", 00:21:44.882 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:44.882 "is_configured": true, 00:21:44.882 "data_offset": 2048, 00:21:44.882 "data_size": 63488 00:21:44.882 } 00:21:44.882 ] 00:21:44.882 }' 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:44.882 13:49:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.882 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:45.142 "name": "raid_bdev1", 00:21:45.142 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:45.142 "strip_size_kb": 0, 00:21:45.142 "state": "online", 00:21:45.142 "raid_level": "raid1", 00:21:45.142 "superblock": true, 00:21:45.142 "num_base_bdevs": 4, 00:21:45.142 "num_base_bdevs_discovered": 3, 00:21:45.142 "num_base_bdevs_operational": 3, 00:21:45.142 "base_bdevs_list": [ 00:21:45.142 { 00:21:45.142 "name": "spare", 00:21:45.142 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:45.142 "is_configured": true, 00:21:45.142 "data_offset": 2048, 00:21:45.142 "data_size": 63488 00:21:45.142 }, 00:21:45.142 { 00:21:45.142 "name": null, 00:21:45.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.142 "is_configured": false, 00:21:45.142 "data_offset": 2048, 00:21:45.142 "data_size": 63488 00:21:45.142 }, 00:21:45.142 { 00:21:45.142 "name": "BaseBdev3", 00:21:45.142 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:45.142 "is_configured": true, 00:21:45.142 "data_offset": 2048, 00:21:45.142 "data_size": 63488 00:21:45.142 }, 00:21:45.142 { 00:21:45.142 "name": "BaseBdev4", 00:21:45.142 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:45.142 "is_configured": true, 00:21:45.142 "data_offset": 2048, 00:21:45.142 "data_size": 63488 00:21:45.142 } 00:21:45.142 ] 00:21:45.142 }' 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.142 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.403 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.403 "name": "raid_bdev1", 00:21:45.403 "uuid": 
"4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:45.403 "strip_size_kb": 0, 00:21:45.403 "state": "online", 00:21:45.403 "raid_level": "raid1", 00:21:45.403 "superblock": true, 00:21:45.403 "num_base_bdevs": 4, 00:21:45.403 "num_base_bdevs_discovered": 3, 00:21:45.403 "num_base_bdevs_operational": 3, 00:21:45.403 "base_bdevs_list": [ 00:21:45.403 { 00:21:45.403 "name": "spare", 00:21:45.403 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:45.403 "is_configured": true, 00:21:45.403 "data_offset": 2048, 00:21:45.403 "data_size": 63488 00:21:45.403 }, 00:21:45.403 { 00:21:45.403 "name": null, 00:21:45.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.403 "is_configured": false, 00:21:45.403 "data_offset": 2048, 00:21:45.403 "data_size": 63488 00:21:45.403 }, 00:21:45.403 { 00:21:45.403 "name": "BaseBdev3", 00:21:45.403 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:45.403 "is_configured": true, 00:21:45.403 "data_offset": 2048, 00:21:45.403 "data_size": 63488 00:21:45.403 }, 00:21:45.403 { 00:21:45.403 "name": "BaseBdev4", 00:21:45.403 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:45.403 "is_configured": true, 00:21:45.403 "data_offset": 2048, 00:21:45.403 "data_size": 63488 00:21:45.403 } 00:21:45.403 ] 00:21:45.403 }' 00:21:45.403 13:49:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.403 13:49:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.974 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:45.974 [2024-06-10 13:50:00.381299] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:45.974 [2024-06-10 13:50:00.381317] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:45.974 [2024-06-10 13:50:00.381364] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:21:45.974 [2024-06-10 13:50:00.381418] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:45.974 [2024-06-10 13:50:00.381425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22c5330 name raid_bdev1, state offline 00:21:45.974 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.974 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:46.234 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:46.235 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:46.235 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:46.235 13:50:00 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:46.495 /dev/nbd0 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:46.495 1+0 records in 00:21:46.495 1+0 records out 00:21:46.495 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294181 s, 13.9 MB/s 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:46.495 13:50:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:46.755 /dev/nbd1 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:46.755 1+0 records in 00:21:46.755 1+0 records out 00:21:46.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284186 
s, 14.4 MB/s 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:46.755 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:47.062 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:47.353 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:47.620 [2024-06-10 13:50:01.911681] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:47.620 [2024-06-10 13:50:01.911717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.620 [2024-06-10 13:50:01.911732] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22c6960 00:21:47.620 [2024-06-10 13:50:01.911740] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.620 [2024-06-10 13:50:01.913140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.620 [2024-06-10 13:50:01.913169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:47.620 [2024-06-10 13:50:01.913231] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:47.620 [2024-06-10 13:50:01.913252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:47.620 [2024-06-10 13:50:01.913335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:47.620 [2024-06-10 13:50:01.913397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:47.620 spare 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.620 13:50:01 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.620 13:50:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.620 [2024-06-10 13:50:02.013691] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22bfe20 00:21:47.620 [2024-06-10 13:50:02.013700] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:47.620 [2024-06-10 13:50:02.013854] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c03c0 00:21:47.620 [2024-06-10 13:50:02.013971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22bfe20 00:21:47.620 [2024-06-10 13:50:02.013977] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22bfe20 00:21:47.620 [2024-06-10 13:50:02.014053] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:47.620 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.620 "name": "raid_bdev1", 00:21:47.620 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:47.620 "strip_size_kb": 0, 00:21:47.620 "state": "online", 00:21:47.620 "raid_level": "raid1", 
00:21:47.620 "superblock": true, 00:21:47.620 "num_base_bdevs": 4, 00:21:47.620 "num_base_bdevs_discovered": 3, 00:21:47.620 "num_base_bdevs_operational": 3, 00:21:47.620 "base_bdevs_list": [ 00:21:47.620 { 00:21:47.620 "name": "spare", 00:21:47.620 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:47.620 "is_configured": true, 00:21:47.620 "data_offset": 2048, 00:21:47.620 "data_size": 63488 00:21:47.620 }, 00:21:47.620 { 00:21:47.620 "name": null, 00:21:47.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.620 "is_configured": false, 00:21:47.620 "data_offset": 2048, 00:21:47.620 "data_size": 63488 00:21:47.620 }, 00:21:47.620 { 00:21:47.620 "name": "BaseBdev3", 00:21:47.620 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:47.620 "is_configured": true, 00:21:47.620 "data_offset": 2048, 00:21:47.620 "data_size": 63488 00:21:47.620 }, 00:21:47.620 { 00:21:47.620 "name": "BaseBdev4", 00:21:47.620 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:47.620 "is_configured": true, 00:21:47.620 "data_offset": 2048, 00:21:47.620 "data_size": 63488 00:21:47.620 } 00:21:47.620 ] 00:21:47.620 }' 00:21:47.620 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.620 13:50:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.190 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:48.450 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:48.450 "name": "raid_bdev1", 00:21:48.450 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:48.450 "strip_size_kb": 0, 00:21:48.450 "state": "online", 00:21:48.450 "raid_level": "raid1", 00:21:48.450 "superblock": true, 00:21:48.450 "num_base_bdevs": 4, 00:21:48.450 "num_base_bdevs_discovered": 3, 00:21:48.450 "num_base_bdevs_operational": 3, 00:21:48.450 "base_bdevs_list": [ 00:21:48.450 { 00:21:48.450 "name": "spare", 00:21:48.450 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:48.450 "is_configured": true, 00:21:48.450 "data_offset": 2048, 00:21:48.450 "data_size": 63488 00:21:48.450 }, 00:21:48.450 { 00:21:48.450 "name": null, 00:21:48.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.450 "is_configured": false, 00:21:48.450 "data_offset": 2048, 00:21:48.450 "data_size": 63488 00:21:48.450 }, 00:21:48.450 { 00:21:48.450 "name": "BaseBdev3", 00:21:48.450 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:48.450 "is_configured": true, 00:21:48.450 "data_offset": 2048, 00:21:48.450 "data_size": 63488 00:21:48.450 }, 00:21:48.450 { 00:21:48.450 "name": "BaseBdev4", 00:21:48.450 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:48.450 "is_configured": true, 00:21:48.450 "data_offset": 2048, 00:21:48.450 "data_size": 63488 00:21:48.450 } 00:21:48.450 ] 00:21:48.450 }' 00:21:48.450 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:48.450 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:48.450 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:21:48.710 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:48.710 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.710 13:50:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:48.710 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:48.710 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:48.970 [2024-06-10 13:50:03.343385] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.970 13:50:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.970 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.229 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.229 "name": "raid_bdev1", 00:21:49.229 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:49.229 "strip_size_kb": 0, 00:21:49.230 "state": "online", 00:21:49.230 "raid_level": "raid1", 00:21:49.230 "superblock": true, 00:21:49.230 "num_base_bdevs": 4, 00:21:49.230 "num_base_bdevs_discovered": 2, 00:21:49.230 "num_base_bdevs_operational": 2, 00:21:49.230 "base_bdevs_list": [ 00:21:49.230 { 00:21:49.230 "name": null, 00:21:49.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.230 "is_configured": false, 00:21:49.230 "data_offset": 2048, 00:21:49.230 "data_size": 63488 00:21:49.230 }, 00:21:49.230 { 00:21:49.230 "name": null, 00:21:49.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.230 "is_configured": false, 00:21:49.230 "data_offset": 2048, 00:21:49.230 "data_size": 63488 00:21:49.230 }, 00:21:49.230 { 00:21:49.230 "name": "BaseBdev3", 00:21:49.230 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:49.230 "is_configured": true, 00:21:49.230 "data_offset": 2048, 00:21:49.230 "data_size": 63488 00:21:49.230 }, 00:21:49.230 { 00:21:49.230 "name": "BaseBdev4", 00:21:49.230 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:49.230 "is_configured": true, 00:21:49.230 "data_offset": 2048, 00:21:49.230 "data_size": 63488 00:21:49.230 } 00:21:49.230 ] 00:21:49.230 }' 00:21:49.230 13:50:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.230 13:50:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.800 13:50:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:50.061 [2024-06-10 13:50:04.293803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:50.061 [2024-06-10 13:50:04.293913] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:50.061 [2024-06-10 13:50:04.293922] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:50.061 [2024-06-10 13:50:04.293940] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:50.061 [2024-06-10 13:50:04.296671] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x224bb70 00:21:50.061 [2024-06-10 13:50:04.298392] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:50.061 13:50:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.001 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:51.261 "name": "raid_bdev1", 00:21:51.261 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:51.261 "strip_size_kb": 0, 00:21:51.261 "state": "online", 00:21:51.261 "raid_level": "raid1", 00:21:51.261 "superblock": true, 00:21:51.261 "num_base_bdevs": 4, 00:21:51.261 "num_base_bdevs_discovered": 3, 00:21:51.261 "num_base_bdevs_operational": 3, 00:21:51.261 "process": { 00:21:51.261 "type": "rebuild", 00:21:51.261 "target": "spare", 00:21:51.261 "progress": { 00:21:51.261 "blocks": 24576, 00:21:51.261 "percent": 38 00:21:51.261 } 00:21:51.261 }, 00:21:51.261 "base_bdevs_list": [ 00:21:51.261 { 00:21:51.261 "name": "spare", 00:21:51.261 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:51.261 "is_configured": true, 00:21:51.261 "data_offset": 2048, 00:21:51.261 "data_size": 63488 00:21:51.261 }, 00:21:51.261 { 00:21:51.261 "name": null, 00:21:51.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.261 "is_configured": false, 00:21:51.261 "data_offset": 2048, 00:21:51.261 "data_size": 63488 00:21:51.261 }, 00:21:51.261 { 00:21:51.261 "name": "BaseBdev3", 00:21:51.261 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:51.261 "is_configured": true, 00:21:51.261 "data_offset": 2048, 00:21:51.261 "data_size": 63488 00:21:51.261 }, 00:21:51.261 { 00:21:51.261 "name": "BaseBdev4", 00:21:51.261 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:51.261 "is_configured": true, 00:21:51.261 "data_offset": 2048, 00:21:51.261 "data_size": 63488 00:21:51.261 } 00:21:51.261 ] 00:21:51.261 }' 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:21:51.261 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:51.521 [2024-06-10 13:50:05.807656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:51.521 [2024-06-10 13:50:05.907933] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:51.521 [2024-06-10 13:50:05.907965] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.521 [2024-06-10 13:50:05.907977] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:51.521 [2024-06-10 13:50:05.907982] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.521 13:50:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.521 13:50:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.781 13:50:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.781 "name": "raid_bdev1", 00:21:51.781 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:51.781 "strip_size_kb": 0, 00:21:51.781 "state": "online", 00:21:51.781 "raid_level": "raid1", 00:21:51.781 "superblock": true, 00:21:51.781 "num_base_bdevs": 4, 00:21:51.781 "num_base_bdevs_discovered": 2, 00:21:51.781 "num_base_bdevs_operational": 2, 00:21:51.781 "base_bdevs_list": [ 00:21:51.781 { 00:21:51.781 "name": null, 00:21:51.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.781 "is_configured": false, 00:21:51.781 "data_offset": 2048, 00:21:51.781 "data_size": 63488 00:21:51.781 }, 00:21:51.781 { 00:21:51.781 "name": null, 00:21:51.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.781 "is_configured": false, 00:21:51.781 "data_offset": 2048, 00:21:51.781 "data_size": 63488 00:21:51.781 }, 00:21:51.781 { 00:21:51.781 "name": "BaseBdev3", 00:21:51.781 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:51.781 "is_configured": true, 00:21:51.781 "data_offset": 2048, 00:21:51.781 "data_size": 63488 00:21:51.781 }, 00:21:51.781 { 00:21:51.781 "name": "BaseBdev4", 00:21:51.781 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:51.781 "is_configured": true, 00:21:51.781 "data_offset": 2048, 00:21:51.781 "data_size": 63488 00:21:51.781 } 00:21:51.781 ] 00:21:51.781 }' 00:21:51.781 13:50:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.781 13:50:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:52.353 13:50:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:52.613 [2024-06-10 13:50:06.898411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:52.613 [2024-06-10 13:50:06.898448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.613 [2024-06-10 13:50:06.898466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b2da0 00:21:52.613 [2024-06-10 13:50:06.898474] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.613 [2024-06-10 13:50:06.898798] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.613 [2024-06-10 13:50:06.898810] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:52.613 [2024-06-10 13:50:06.898869] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:52.613 [2024-06-10 13:50:06.898876] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:52.613 [2024-06-10 13:50:06.898882] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:52.613 [2024-06-10 13:50:06.898894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:52.613 [2024-06-10 13:50:06.901671] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b1f10 00:21:52.613 [2024-06-10 13:50:06.902889] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:52.613 spare 00:21:52.613 13:50:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:53.554 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.555 13:50:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.816 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:53.816 "name": "raid_bdev1", 00:21:53.816 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:53.816 "strip_size_kb": 0, 00:21:53.816 "state": "online", 00:21:53.816 "raid_level": "raid1", 00:21:53.816 "superblock": true, 00:21:53.816 "num_base_bdevs": 4, 00:21:53.816 "num_base_bdevs_discovered": 3, 00:21:53.816 "num_base_bdevs_operational": 3, 00:21:53.816 "process": { 00:21:53.816 "type": "rebuild", 00:21:53.816 "target": "spare", 00:21:53.816 "progress": { 00:21:53.816 "blocks": 24576, 00:21:53.816 
"percent": 38 00:21:53.816 } 00:21:53.816 }, 00:21:53.816 "base_bdevs_list": [ 00:21:53.816 { 00:21:53.816 "name": "spare", 00:21:53.816 "uuid": "dda99f09-115b-5b42-a04d-c97334a3f1cb", 00:21:53.816 "is_configured": true, 00:21:53.816 "data_offset": 2048, 00:21:53.816 "data_size": 63488 00:21:53.816 }, 00:21:53.816 { 00:21:53.816 "name": null, 00:21:53.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.816 "is_configured": false, 00:21:53.816 "data_offset": 2048, 00:21:53.816 "data_size": 63488 00:21:53.816 }, 00:21:53.816 { 00:21:53.816 "name": "BaseBdev3", 00:21:53.816 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:53.816 "is_configured": true, 00:21:53.816 "data_offset": 2048, 00:21:53.816 "data_size": 63488 00:21:53.816 }, 00:21:53.816 { 00:21:53.816 "name": "BaseBdev4", 00:21:53.816 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:53.816 "is_configured": true, 00:21:53.816 "data_offset": 2048, 00:21:53.816 "data_size": 63488 00:21:53.816 } 00:21:53.816 ] 00:21:53.816 }' 00:21:53.816 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:53.817 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:53.817 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:53.817 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:53.817 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:54.077 [2024-06-10 13:50:08.391273] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:54.077 [2024-06-10 13:50:08.412249] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:54.077 [2024-06-10 13:50:08.412278] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.077 [2024-06-10 13:50:08.412289] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:54.078 [2024-06-10 13:50:08.412293] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.078 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.338 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.338 "name": "raid_bdev1", 00:21:54.338 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:54.338 "strip_size_kb": 0, 00:21:54.338 "state": 
"online", 00:21:54.338 "raid_level": "raid1", 00:21:54.338 "superblock": true, 00:21:54.338 "num_base_bdevs": 4, 00:21:54.338 "num_base_bdevs_discovered": 2, 00:21:54.338 "num_base_bdevs_operational": 2, 00:21:54.338 "base_bdevs_list": [ 00:21:54.338 { 00:21:54.338 "name": null, 00:21:54.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.338 "is_configured": false, 00:21:54.338 "data_offset": 2048, 00:21:54.338 "data_size": 63488 00:21:54.338 }, 00:21:54.338 { 00:21:54.338 "name": null, 00:21:54.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.338 "is_configured": false, 00:21:54.338 "data_offset": 2048, 00:21:54.338 "data_size": 63488 00:21:54.338 }, 00:21:54.338 { 00:21:54.338 "name": "BaseBdev3", 00:21:54.338 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:54.338 "is_configured": true, 00:21:54.338 "data_offset": 2048, 00:21:54.338 "data_size": 63488 00:21:54.338 }, 00:21:54.338 { 00:21:54.338 "name": "BaseBdev4", 00:21:54.338 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:54.338 "is_configured": true, 00:21:54.338 "data_offset": 2048, 00:21:54.338 "data_size": 63488 00:21:54.338 } 00:21:54.338 ] 00:21:54.338 }' 00:21:54.338 13:50:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.338 13:50:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:54.934 "name": "raid_bdev1", 00:21:54.934 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:54.934 "strip_size_kb": 0, 00:21:54.934 "state": "online", 00:21:54.934 "raid_level": "raid1", 00:21:54.934 "superblock": true, 00:21:54.934 "num_base_bdevs": 4, 00:21:54.934 "num_base_bdevs_discovered": 2, 00:21:54.934 "num_base_bdevs_operational": 2, 00:21:54.934 "base_bdevs_list": [ 00:21:54.934 { 00:21:54.934 "name": null, 00:21:54.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.934 "is_configured": false, 00:21:54.934 "data_offset": 2048, 00:21:54.934 "data_size": 63488 00:21:54.934 }, 00:21:54.934 { 00:21:54.934 "name": null, 00:21:54.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.934 "is_configured": false, 00:21:54.934 "data_offset": 2048, 00:21:54.934 "data_size": 63488 00:21:54.934 }, 00:21:54.934 { 00:21:54.934 "name": "BaseBdev3", 00:21:54.934 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:54.934 "is_configured": true, 00:21:54.934 "data_offset": 2048, 00:21:54.934 "data_size": 63488 00:21:54.934 }, 00:21:54.934 { 00:21:54.934 "name": "BaseBdev4", 00:21:54.934 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:54.934 "is_configured": true, 00:21:54.934 "data_offset": 2048, 00:21:54.934 "data_size": 63488 00:21:54.934 } 00:21:54.934 ] 00:21:54.934 }' 00:21:54.934 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:55.195 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:55.195 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:21:55.195 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:55.195 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:55.456 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:55.456 [2024-06-10 13:50:09.883609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:55.456 [2024-06-10 13:50:09.883640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.456 [2024-06-10 13:50:09.883653] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b3990 00:21:55.457 [2024-06-10 13:50:09.883660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.457 [2024-06-10 13:50:09.883952] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.457 [2024-06-10 13:50:09.883965] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:55.457 [2024-06-10 13:50:09.884011] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:55.457 [2024-06-10 13:50:09.884019] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:55.457 [2024-06-10 13:50:09.884025] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:55.457 BaseBdev1 00:21:55.457 13:50:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:56.840 
13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.840 13:50:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.840 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.840 "name": "raid_bdev1", 00:21:56.840 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:56.840 "strip_size_kb": 0, 00:21:56.840 "state": "online", 00:21:56.840 "raid_level": "raid1", 00:21:56.840 "superblock": true, 00:21:56.840 "num_base_bdevs": 4, 00:21:56.840 "num_base_bdevs_discovered": 2, 00:21:56.840 "num_base_bdevs_operational": 2, 00:21:56.840 "base_bdevs_list": [ 00:21:56.840 { 00:21:56.840 "name": null, 00:21:56.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.840 "is_configured": false, 00:21:56.840 "data_offset": 2048, 00:21:56.840 "data_size": 63488 00:21:56.840 }, 
00:21:56.840 { 00:21:56.840 "name": null, 00:21:56.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.840 "is_configured": false, 00:21:56.840 "data_offset": 2048, 00:21:56.840 "data_size": 63488 00:21:56.840 }, 00:21:56.840 { 00:21:56.840 "name": "BaseBdev3", 00:21:56.840 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:56.840 "is_configured": true, 00:21:56.840 "data_offset": 2048, 00:21:56.840 "data_size": 63488 00:21:56.840 }, 00:21:56.840 { 00:21:56.840 "name": "BaseBdev4", 00:21:56.840 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:56.840 "is_configured": true, 00:21:56.840 "data_offset": 2048, 00:21:56.840 "data_size": 63488 00:21:56.840 } 00:21:56.840 ] 00:21:56.840 }' 00:21:56.840 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.840 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:57.411 "name": "raid_bdev1", 00:21:57.411 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:57.411 
"strip_size_kb": 0, 00:21:57.411 "state": "online", 00:21:57.411 "raid_level": "raid1", 00:21:57.411 "superblock": true, 00:21:57.411 "num_base_bdevs": 4, 00:21:57.411 "num_base_bdevs_discovered": 2, 00:21:57.411 "num_base_bdevs_operational": 2, 00:21:57.411 "base_bdevs_list": [ 00:21:57.411 { 00:21:57.411 "name": null, 00:21:57.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.411 "is_configured": false, 00:21:57.411 "data_offset": 2048, 00:21:57.411 "data_size": 63488 00:21:57.411 }, 00:21:57.411 { 00:21:57.411 "name": null, 00:21:57.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.411 "is_configured": false, 00:21:57.411 "data_offset": 2048, 00:21:57.411 "data_size": 63488 00:21:57.411 }, 00:21:57.411 { 00:21:57.411 "name": "BaseBdev3", 00:21:57.411 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:57.411 "is_configured": true, 00:21:57.411 "data_offset": 2048, 00:21:57.411 "data_size": 63488 00:21:57.411 }, 00:21:57.411 { 00:21:57.411 "name": "BaseBdev4", 00:21:57.411 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:57.411 "is_configured": true, 00:21:57.411 "data_offset": 2048, 00:21:57.411 "data_size": 63488 00:21:57.411 } 00:21:57.411 ] 00:21:57.411 }' 00:21:57.411 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:21:57.671 13:50:11 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:57.671 13:50:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:57.931 [2024-06-10 13:50:12.157396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:57.931 [2024-06-10 13:50:12.157494] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:57.931 [2024-06-10 13:50:12.157509] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:57.931 request: 00:21:57.931 { 00:21:57.931 "raid_bdev": "raid_bdev1", 00:21:57.931 "base_bdev": "BaseBdev1", 00:21:57.931 "method": "bdev_raid_add_base_bdev", 00:21:57.931 "req_id": 1 00:21:57.931 } 00:21:57.931 Got JSON-RPC error response 00:21:57.931 response: 00:21:57.931 { 00:21:57.931 "code": -22, 00:21:57.931 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:57.931 } 00:21:57.931 13:50:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:21:57.931 13:50:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:21:57.931 13:50:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:21:57.931 13:50:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:21:57.931 13:50:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.871 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.132 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.132 "name": "raid_bdev1", 00:21:59.132 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:59.132 "strip_size_kb": 0, 00:21:59.132 "state": "online", 00:21:59.132 "raid_level": "raid1", 00:21:59.132 "superblock": true, 00:21:59.132 "num_base_bdevs": 4, 00:21:59.132 "num_base_bdevs_discovered": 2, 00:21:59.132 "num_base_bdevs_operational": 2, 00:21:59.132 "base_bdevs_list": [ 00:21:59.132 { 00:21:59.132 "name": null, 00:21:59.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.132 "is_configured": false, 00:21:59.132 "data_offset": 2048, 00:21:59.132 "data_size": 63488 00:21:59.132 }, 00:21:59.132 { 00:21:59.132 "name": null, 00:21:59.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.132 "is_configured": false, 00:21:59.132 "data_offset": 2048, 00:21:59.132 "data_size": 63488 00:21:59.132 }, 00:21:59.132 { 00:21:59.132 "name": "BaseBdev3", 00:21:59.132 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 00:21:59.132 "is_configured": true, 00:21:59.132 "data_offset": 2048, 00:21:59.132 "data_size": 63488 00:21:59.132 }, 00:21:59.132 { 00:21:59.132 "name": "BaseBdev4", 00:21:59.132 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:59.132 "is_configured": true, 00:21:59.132 "data_offset": 2048, 00:21:59.132 "data_size": 63488 00:21:59.132 } 00:21:59.132 ] 00:21:59.132 }' 00:21:59.132 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.132 13:50:13 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.701 13:50:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.701 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:59.701 "name": "raid_bdev1", 00:21:59.701 "uuid": "4f1db5b6-f6c6-45ea-87fa-2360ce6221ea", 00:21:59.701 "strip_size_kb": 0, 00:21:59.701 "state": "online", 00:21:59.701 "raid_level": "raid1", 00:21:59.701 "superblock": true, 00:21:59.701 "num_base_bdevs": 4, 00:21:59.701 "num_base_bdevs_discovered": 2, 00:21:59.701 "num_base_bdevs_operational": 2, 00:21:59.701 "base_bdevs_list": [ 00:21:59.701 { 00:21:59.701 "name": null, 00:21:59.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.701 "is_configured": false, 00:21:59.701 "data_offset": 2048, 00:21:59.701 "data_size": 63488 00:21:59.701 }, 00:21:59.701 { 00:21:59.701 "name": null, 00:21:59.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.701 "is_configured": false, 00:21:59.701 "data_offset": 2048, 00:21:59.701 "data_size": 63488 00:21:59.701 }, 00:21:59.701 { 00:21:59.701 "name": "BaseBdev3", 00:21:59.701 "uuid": "a8b7ac7f-1c9a-5c57-b75e-d227f151c748", 
00:21:59.701 "is_configured": true, 00:21:59.701 "data_offset": 2048, 00:21:59.701 "data_size": 63488 00:21:59.701 }, 00:21:59.701 { 00:21:59.701 "name": "BaseBdev4", 00:21:59.701 "uuid": "d514c9ec-cbf9-596a-8e2e-ad17c405b7f8", 00:21:59.701 "is_configured": true, 00:21:59.701 "data_offset": 2048, 00:21:59.701 "data_size": 63488 00:21:59.701 } 00:21:59.701 ] 00:21:59.701 }' 00:21:59.701 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1645408 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 1645408 ']' 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 1645408 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1645408 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1645408' 00:21:59.961 killing process with pid 1645408 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 1645408 00:21:59.961 
Received shutdown signal, test time was about 60.000000 seconds 00:21:59.961 00:21:59.961 Latency(us) 00:21:59.961 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:59.961 =================================================================================================================== 00:21:59.961 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:59.961 [2024-06-10 13:50:14.283713] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:59.961 [2024-06-10 13:50:14.283789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:59.961 [2024-06-10 13:50:14.283838] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:59.961 [2024-06-10 13:50:14.283845] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22bfe20 name raid_bdev1, state offline 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 1645408 00:21:59.961 [2024-06-10 13:50:14.311155] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:21:59.961 00:21:59.961 real 0m36.540s 00:21:59.961 user 0m52.064s 00:21:59.961 sys 0m5.325s 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:59.961 13:50:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:59.961 ************************************ 00:21:59.961 END TEST raid_rebuild_test_sb 00:21:59.961 ************************************ 00:22:00.221 13:50:14 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:22:00.221 13:50:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:00.221 13:50:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:00.221 13:50:14 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:22:00.221 ************************************ 00:22:00.221 START TEST raid_rebuild_test_io 00:22:00.221 ************************************ 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false true true 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.221 13:50:14 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1652791 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1652791 /var/tmp/spdk-raid.sock 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 1652791 ']' 00:22:00.221 13:50:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:00.222 13:50:14 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:00.222 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:00.222 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:00.222 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:00.222 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:00.222 13:50:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:00.222 [2024-06-10 13:50:14.579496] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:22:00.222 [2024-06-10 13:50:14.579543] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652791 ] 00:22:00.222 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:00.222 Zero copy mechanism will not be used. 
00:22:00.222 [2024-06-10 13:50:14.666639] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:00.481 [2024-06-10 13:50:14.732402] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:22:00.481 [2024-06-10 13:50:14.774624] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:00.481 [2024-06-10 13:50:14.774649] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.052 13:50:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:01.052 13:50:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:22:01.052 13:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:01.052 13:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:01.312 BaseBdev1_malloc 00:22:01.312 13:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:01.572 [2024-06-10 13:50:15.813839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:01.572 [2024-06-10 13:50:15.813874] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.572 [2024-06-10 13:50:15.813888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13dd900 00:22:01.572 [2024-06-10 13:50:15.813895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.572 [2024-06-10 13:50:15.815297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.572 [2024-06-10 13:50:15.815317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:01.572 BaseBdev1 
00:22:01.572 13:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:01.572 13:50:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:01.572 BaseBdev2_malloc 00:22:01.572 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:01.832 [2024-06-10 13:50:16.217094] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:01.832 [2024-06-10 13:50:16.217124] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.832 [2024-06-10 13:50:16.217135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13de9c0 00:22:01.832 [2024-06-10 13:50:16.217141] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.832 [2024-06-10 13:50:16.218406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.832 [2024-06-10 13:50:16.218425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:01.832 BaseBdev2 00:22:01.832 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:01.832 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:02.093 BaseBdev3_malloc 00:22:02.093 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:02.354 [2024-06-10 13:50:16.624169] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:02.354 [2024-06-10 13:50:16.624198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.354 [2024-06-10 13:50:16.624209] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1588e80 00:22:02.354 [2024-06-10 13:50:16.624216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.354 [2024-06-10 13:50:16.625480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.354 [2024-06-10 13:50:16.625500] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:02.354 BaseBdev3 00:22:02.354 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:02.354 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:02.615 BaseBdev4_malloc 00:22:02.615 13:50:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:02.615 [2024-06-10 13:50:17.031273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:02.615 [2024-06-10 13:50:17.031300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.615 [2024-06-10 13:50:17.031311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x158bb20 00:22:02.615 [2024-06-10 13:50:17.031318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.615 [2024-06-10 13:50:17.032591] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.615 [2024-06-10 13:50:17.032609] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:22:02.615 BaseBdev4 00:22:02.615 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:02.875 spare_malloc 00:22:02.875 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:03.135 spare_delay 00:22:03.135 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:03.395 [2024-06-10 13:50:17.634817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:03.395 [2024-06-10 13:50:17.634847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.395 [2024-06-10 13:50:17.634861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x158e730 00:22:03.395 [2024-06-10 13:50:17.634867] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.395 [2024-06-10 13:50:17.636137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.395 [2024-06-10 13:50:17.636157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:03.395 spare 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:03.395 [2024-06-10 13:50:17.823313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:03.395 [2024-06-10 13:50:17.824377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:22:03.395 [2024-06-10 13:50:17.824420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:03.395 [2024-06-10 13:50:17.824461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:03.395 [2024-06-10 13:50:17.824525] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x158d330 00:22:03.395 [2024-06-10 13:50:17.824531] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:03.395 [2024-06-10 13:50:17.824695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x158dc70 00:22:03.395 [2024-06-10 13:50:17.824811] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x158d330 00:22:03.395 [2024-06-10 13:50:17.824817] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x158d330 00:22:03.395 [2024-06-10 13:50:17.824904] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.395 13:50:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.656 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.656 "name": "raid_bdev1", 00:22:03.656 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:03.656 "strip_size_kb": 0, 00:22:03.656 "state": "online", 00:22:03.656 "raid_level": "raid1", 00:22:03.656 "superblock": false, 00:22:03.656 "num_base_bdevs": 4, 00:22:03.656 "num_base_bdevs_discovered": 4, 00:22:03.656 "num_base_bdevs_operational": 4, 00:22:03.657 "base_bdevs_list": [ 00:22:03.657 { 00:22:03.657 "name": "BaseBdev1", 00:22:03.657 "uuid": "27db1c75-d2ea-5ac3-adae-cec6af52b6fa", 00:22:03.657 "is_configured": true, 00:22:03.657 "data_offset": 0, 00:22:03.657 "data_size": 65536 00:22:03.657 }, 00:22:03.657 { 00:22:03.657 "name": "BaseBdev2", 00:22:03.657 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:03.657 "is_configured": true, 00:22:03.657 "data_offset": 0, 00:22:03.657 "data_size": 65536 00:22:03.657 }, 00:22:03.657 { 00:22:03.657 "name": "BaseBdev3", 00:22:03.657 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:03.657 "is_configured": true, 00:22:03.657 "data_offset": 0, 00:22:03.657 "data_size": 65536 00:22:03.657 }, 00:22:03.657 { 00:22:03.657 "name": "BaseBdev4", 00:22:03.657 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:03.657 "is_configured": true, 00:22:03.657 "data_offset": 0, 00:22:03.657 "data_size": 65536 00:22:03.657 } 00:22:03.657 ] 00:22:03.657 }' 00:22:03.657 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:22:03.657 13:50:18 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:04.228 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:04.228 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:04.488 [2024-06-10 13:50:18.777955] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.488 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:04.488 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.488 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:04.749 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:04.749 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:04.749 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:04.749 13:50:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:04.749 [2024-06-10 13:50:19.091939] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1512eb0 00:22:04.749 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:04.749 Zero copy mechanism will not be used. 00:22:04.749 Running I/O for 60 seconds... 
00:22:04.749 [2024-06-10 13:50:19.184347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:04.749 [2024-06-10 13:50:19.184499] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1512eb0 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.749 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:05.009 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.009 "name": "raid_bdev1", 00:22:05.009 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:05.009 "strip_size_kb": 0, 00:22:05.009 "state": "online", 00:22:05.009 "raid_level": "raid1", 00:22:05.009 "superblock": false, 
00:22:05.009 "num_base_bdevs": 4, 00:22:05.009 "num_base_bdevs_discovered": 3, 00:22:05.009 "num_base_bdevs_operational": 3, 00:22:05.009 "base_bdevs_list": [ 00:22:05.009 { 00:22:05.009 "name": null, 00:22:05.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.009 "is_configured": false, 00:22:05.010 "data_offset": 0, 00:22:05.010 "data_size": 65536 00:22:05.010 }, 00:22:05.010 { 00:22:05.010 "name": "BaseBdev2", 00:22:05.010 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:05.010 "is_configured": true, 00:22:05.010 "data_offset": 0, 00:22:05.010 "data_size": 65536 00:22:05.010 }, 00:22:05.010 { 00:22:05.010 "name": "BaseBdev3", 00:22:05.010 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:05.010 "is_configured": true, 00:22:05.010 "data_offset": 0, 00:22:05.010 "data_size": 65536 00:22:05.010 }, 00:22:05.010 { 00:22:05.010 "name": "BaseBdev4", 00:22:05.010 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:05.010 "is_configured": true, 00:22:05.010 "data_offset": 0, 00:22:05.010 "data_size": 65536 00:22:05.010 } 00:22:05.010 ] 00:22:05.010 }' 00:22:05.010 13:50:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.010 13:50:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:05.581 13:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:05.842 [2024-06-10 13:50:20.192566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:05.842 13:50:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:05.842 [2024-06-10 13:50:20.252513] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15115c0 00:22:05.842 [2024-06-10 13:50:20.254445] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:06.103 [2024-06-10 
13:50:20.371699] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:06.103 [2024-06-10 13:50:20.372414] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:06.363 [2024-06-10 13:50:20.600002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:06.363 [2024-06-10 13:50:20.600363] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:06.622 [2024-06-10 13:50:20.955238] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:06.623 [2024-06-10 13:50:20.956049] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:06.882 [2024-06-10 13:50:21.174558] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.882 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.142 
13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:07.142 "name": "raid_bdev1", 00:22:07.142 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:07.142 "strip_size_kb": 0, 00:22:07.142 "state": "online", 00:22:07.142 "raid_level": "raid1", 00:22:07.142 "superblock": false, 00:22:07.142 "num_base_bdevs": 4, 00:22:07.142 "num_base_bdevs_discovered": 4, 00:22:07.142 "num_base_bdevs_operational": 4, 00:22:07.142 "process": { 00:22:07.142 "type": "rebuild", 00:22:07.142 "target": "spare", 00:22:07.142 "progress": { 00:22:07.142 "blocks": 12288, 00:22:07.142 "percent": 18 00:22:07.142 } 00:22:07.142 }, 00:22:07.142 "base_bdevs_list": [ 00:22:07.142 { 00:22:07.142 "name": "spare", 00:22:07.142 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:07.142 "is_configured": true, 00:22:07.142 "data_offset": 0, 00:22:07.142 "data_size": 65536 00:22:07.142 }, 00:22:07.142 { 00:22:07.142 "name": "BaseBdev2", 00:22:07.142 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:07.142 "is_configured": true, 00:22:07.142 "data_offset": 0, 00:22:07.142 "data_size": 65536 00:22:07.142 }, 00:22:07.142 { 00:22:07.142 "name": "BaseBdev3", 00:22:07.142 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:07.142 "is_configured": true, 00:22:07.142 "data_offset": 0, 00:22:07.142 "data_size": 65536 00:22:07.142 }, 00:22:07.142 { 00:22:07.142 "name": "BaseBdev4", 00:22:07.142 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:07.142 "is_configured": true, 00:22:07.142 "data_offset": 0, 00:22:07.142 "data_size": 65536 00:22:07.142 } 00:22:07.142 ] 00:22:07.142 }' 00:22:07.142 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:07.142 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:07.142 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:07.142 [2024-06-10 13:50:21.512578] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:07.142 [2024-06-10 13:50:21.519671] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:07.143 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:07.143 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:07.403 [2024-06-10 13:50:21.735396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:07.403 [2024-06-10 13:50:21.738607] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:07.403 [2024-06-10 13:50:21.840460] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:07.403 [2024-06-10 13:50:21.857228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.403 [2024-06-10 13:50:21.857249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:07.403 [2024-06-10 13:50:21.857255] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:07.403 [2024-06-10 13:50:21.875304] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1512eb0 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.663 13:50:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.663 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.663 "name": "raid_bdev1", 00:22:07.663 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:07.663 "strip_size_kb": 0, 00:22:07.663 "state": "online", 00:22:07.663 "raid_level": "raid1", 00:22:07.663 "superblock": false, 00:22:07.663 "num_base_bdevs": 4, 00:22:07.663 "num_base_bdevs_discovered": 3, 00:22:07.663 "num_base_bdevs_operational": 3, 00:22:07.663 "base_bdevs_list": [ 00:22:07.663 { 00:22:07.663 "name": null, 00:22:07.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.663 "is_configured": false, 00:22:07.663 "data_offset": 0, 00:22:07.663 "data_size": 65536 00:22:07.663 }, 00:22:07.663 { 00:22:07.663 "name": "BaseBdev2", 00:22:07.663 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:07.663 "is_configured": true, 00:22:07.663 "data_offset": 0, 00:22:07.664 "data_size": 65536 00:22:07.664 }, 00:22:07.664 { 00:22:07.664 "name": "BaseBdev3", 00:22:07.664 "uuid": 
"69b83b16-6306-5835-b719-4da89964d7bf", 00:22:07.664 "is_configured": true, 00:22:07.664 "data_offset": 0, 00:22:07.664 "data_size": 65536 00:22:07.664 }, 00:22:07.664 { 00:22:07.664 "name": "BaseBdev4", 00:22:07.664 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:07.664 "is_configured": true, 00:22:07.664 "data_offset": 0, 00:22:07.664 "data_size": 65536 00:22:07.664 } 00:22:07.664 ] 00:22:07.664 }' 00:22:07.664 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.664 13:50:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.234 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.494 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:08.494 "name": "raid_bdev1", 00:22:08.494 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:08.494 "strip_size_kb": 0, 00:22:08.494 "state": "online", 00:22:08.494 "raid_level": "raid1", 00:22:08.494 "superblock": false, 00:22:08.494 "num_base_bdevs": 4, 00:22:08.494 "num_base_bdevs_discovered": 3, 00:22:08.494 "num_base_bdevs_operational": 3, 00:22:08.494 "base_bdevs_list": [ 00:22:08.494 { 
00:22:08.494 "name": null, 00:22:08.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.494 "is_configured": false, 00:22:08.494 "data_offset": 0, 00:22:08.494 "data_size": 65536 00:22:08.494 }, 00:22:08.494 { 00:22:08.494 "name": "BaseBdev2", 00:22:08.494 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:08.494 "is_configured": true, 00:22:08.494 "data_offset": 0, 00:22:08.494 "data_size": 65536 00:22:08.494 }, 00:22:08.494 { 00:22:08.494 "name": "BaseBdev3", 00:22:08.494 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:08.494 "is_configured": true, 00:22:08.494 "data_offset": 0, 00:22:08.494 "data_size": 65536 00:22:08.494 }, 00:22:08.494 { 00:22:08.494 "name": "BaseBdev4", 00:22:08.494 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:08.494 "is_configured": true, 00:22:08.494 "data_offset": 0, 00:22:08.494 "data_size": 65536 00:22:08.494 } 00:22:08.494 ] 00:22:08.494 }' 00:22:08.495 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:08.495 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:08.495 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:08.754 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:08.754 13:50:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:08.754 [2024-06-10 13:50:23.196996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:09.014 [2024-06-10 13:50:23.234851] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15840f0 00:22:09.014 [2024-06-10 13:50:23.236105] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:09.014 13:50:23 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:09.015 [2024-06-10 13:50:23.369537] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:09.275 [2024-06-10 13:50:23.509536] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:09.275 [2024-06-10 13:50:23.509955] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:09.536 [2024-06-10 13:50:23.854945] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:09.536 [2024-06-10 13:50:23.855155] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:09.797 [2024-06-10 13:50:24.081782] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:09.797 [2024-06-10 13:50:24.082172] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:09.797 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.797 13:50:24 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.058 [2024-06-10 13:50:24.403682] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:10.058 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:10.058 "name": "raid_bdev1", 00:22:10.058 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:10.058 "strip_size_kb": 0, 00:22:10.058 "state": "online", 00:22:10.058 "raid_level": "raid1", 00:22:10.058 "superblock": false, 00:22:10.058 "num_base_bdevs": 4, 00:22:10.058 "num_base_bdevs_discovered": 4, 00:22:10.058 "num_base_bdevs_operational": 4, 00:22:10.058 "process": { 00:22:10.058 "type": "rebuild", 00:22:10.058 "target": "spare", 00:22:10.058 "progress": { 00:22:10.058 "blocks": 14336, 00:22:10.058 "percent": 21 00:22:10.058 } 00:22:10.058 }, 00:22:10.058 "base_bdevs_list": [ 00:22:10.058 { 00:22:10.058 "name": "spare", 00:22:10.058 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:10.058 "is_configured": true, 00:22:10.058 "data_offset": 0, 00:22:10.058 "data_size": 65536 00:22:10.058 }, 00:22:10.058 { 00:22:10.058 "name": "BaseBdev2", 00:22:10.058 "uuid": "6c8c02d2-c96e-5413-ba55-e06cef04a149", 00:22:10.058 "is_configured": true, 00:22:10.058 "data_offset": 0, 00:22:10.058 "data_size": 65536 00:22:10.058 }, 00:22:10.058 { 00:22:10.058 "name": "BaseBdev3", 00:22:10.058 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:10.058 "is_configured": true, 00:22:10.058 "data_offset": 0, 00:22:10.058 "data_size": 65536 00:22:10.058 }, 00:22:10.058 { 00:22:10.058 "name": "BaseBdev4", 00:22:10.058 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:10.058 "is_configured": true, 00:22:10.058 "data_offset": 0, 00:22:10.058 "data_size": 65536 00:22:10.058 } 00:22:10.058 ] 00:22:10.058 }' 00:22:10.058 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // 
"none"' 00:22:10.058 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:10.058 [2024-06-10 13:50:24.505110] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:10.058 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:10.319 13:50:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:10.319 [2024-06-10 13:50:24.738099] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:10.319 [2024-06-10 13:50:24.739568] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:10.580 [2024-06-10 13:50:24.964106] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:10.580 [2024-06-10 13:50:24.964515] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:10.841 [2024-06-10 13:50:25.073958] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1512eb0 00:22:10.841 [2024-06-10 13:50:25.073977] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 
0x15840f0 00:22:10.841 [2024-06-10 13:50:25.074011] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.841 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.102 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.102 "name": "raid_bdev1", 00:22:11.102 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:11.102 "strip_size_kb": 0, 00:22:11.102 "state": "online", 00:22:11.102 "raid_level": "raid1", 00:22:11.102 "superblock": false, 00:22:11.102 "num_base_bdevs": 4, 00:22:11.102 "num_base_bdevs_discovered": 3, 00:22:11.102 "num_base_bdevs_operational": 3, 00:22:11.102 "process": { 00:22:11.102 "type": "rebuild", 00:22:11.102 "target": "spare", 00:22:11.102 "progress": { 00:22:11.102 "blocks": 22528, 00:22:11.102 "percent": 34 00:22:11.102 } 00:22:11.102 }, 00:22:11.102 "base_bdevs_list": [ 00:22:11.102 { 
00:22:11.102 "name": "spare", 00:22:11.102 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:11.102 "is_configured": true, 00:22:11.102 "data_offset": 0, 00:22:11.102 "data_size": 65536 00:22:11.102 }, 00:22:11.102 { 00:22:11.102 "name": null, 00:22:11.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.102 "is_configured": false, 00:22:11.103 "data_offset": 0, 00:22:11.103 "data_size": 65536 00:22:11.103 }, 00:22:11.103 { 00:22:11.103 "name": "BaseBdev3", 00:22:11.103 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:11.103 "is_configured": true, 00:22:11.103 "data_offset": 0, 00:22:11.103 "data_size": 65536 00:22:11.103 }, 00:22:11.103 { 00:22:11.103 "name": "BaseBdev4", 00:22:11.103 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:11.103 "is_configured": true, 00:22:11.103 "data_offset": 0, 00:22:11.103 "data_size": 65536 00:22:11.103 } 00:22:11.103 ] 00:22:11.103 }' 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=813 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.103 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.103 [2024-06-10 13:50:25.426759] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:11.364 "name": "raid_bdev1", 00:22:11.364 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:11.364 "strip_size_kb": 0, 00:22:11.364 "state": "online", 00:22:11.364 "raid_level": "raid1", 00:22:11.364 "superblock": false, 00:22:11.364 "num_base_bdevs": 4, 00:22:11.364 "num_base_bdevs_discovered": 3, 00:22:11.364 "num_base_bdevs_operational": 3, 00:22:11.364 "process": { 00:22:11.364 "type": "rebuild", 00:22:11.364 "target": "spare", 00:22:11.364 "progress": { 00:22:11.364 "blocks": 26624, 00:22:11.364 "percent": 40 00:22:11.364 } 00:22:11.364 }, 00:22:11.364 "base_bdevs_list": [ 00:22:11.364 { 00:22:11.364 "name": "spare", 00:22:11.364 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:11.364 "is_configured": true, 00:22:11.364 "data_offset": 0, 00:22:11.364 "data_size": 65536 00:22:11.364 }, 00:22:11.364 { 00:22:11.364 "name": null, 00:22:11.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.364 "is_configured": false, 00:22:11.364 "data_offset": 0, 00:22:11.364 "data_size": 65536 00:22:11.364 }, 00:22:11.364 { 00:22:11.364 "name": "BaseBdev3", 00:22:11.364 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:11.364 "is_configured": true, 00:22:11.364 "data_offset": 0, 00:22:11.364 "data_size": 65536 00:22:11.364 }, 
00:22:11.364 { 00:22:11.364 "name": "BaseBdev4", 00:22:11.364 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:11.364 "is_configured": true, 00:22:11.364 "data_offset": 0, 00:22:11.364 "data_size": 65536 00:22:11.364 } 00:22:11.364 ] 00:22:11.364 }' 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:11.364 [2024-06-10 13:50:25.649984] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:11.364 [2024-06-10 13:50:25.650128] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.364 13:50:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:11.625 [2024-06-10 13:50:26.093706] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.566 13:50:26 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.566 [2024-06-10 13:50:26.777436] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.566 "name": "raid_bdev1", 00:22:12.566 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:12.566 "strip_size_kb": 0, 00:22:12.566 "state": "online", 00:22:12.566 "raid_level": "raid1", 00:22:12.566 "superblock": false, 00:22:12.566 "num_base_bdevs": 4, 00:22:12.566 "num_base_bdevs_discovered": 3, 00:22:12.566 "num_base_bdevs_operational": 3, 00:22:12.566 "process": { 00:22:12.566 "type": "rebuild", 00:22:12.566 "target": "spare", 00:22:12.566 "progress": { 00:22:12.566 "blocks": 47104, 00:22:12.566 "percent": 71 00:22:12.566 } 00:22:12.566 }, 00:22:12.566 "base_bdevs_list": [ 00:22:12.566 { 00:22:12.566 "name": "spare", 00:22:12.566 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:12.566 "is_configured": true, 00:22:12.566 "data_offset": 0, 00:22:12.566 "data_size": 65536 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": null, 00:22:12.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.566 "is_configured": false, 00:22:12.566 "data_offset": 0, 00:22:12.566 "data_size": 65536 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": "BaseBdev3", 00:22:12.566 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:12.566 "is_configured": true, 00:22:12.566 "data_offset": 0, 00:22:12.566 "data_size": 65536 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": "BaseBdev4", 00:22:12.566 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:12.566 "is_configured": true, 00:22:12.566 
"data_offset": 0, 00:22:12.566 "data_size": 65536 00:22:12.566 } 00:22:12.566 ] 00:22:12.566 }' 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.566 13:50:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.566 13:50:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.566 13:50:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:12.826 [2024-06-10 13:50:27.118370] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:22:13.086 [2024-06-10 13:50:27.344447] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:13.347 [2024-06-10 13:50:27.668702] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:13.347 [2024-06-10 13:50:27.668950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:13.607 13:50:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.607 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.907 [2024-06-10 13:50:28.214199] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:13.907 "name": "raid_bdev1", 00:22:13.907 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:13.907 "strip_size_kb": 0, 00:22:13.907 "state": "online", 00:22:13.907 "raid_level": "raid1", 00:22:13.907 "superblock": false, 00:22:13.907 "num_base_bdevs": 4, 00:22:13.907 "num_base_bdevs_discovered": 3, 00:22:13.907 "num_base_bdevs_operational": 3, 00:22:13.907 "process": { 00:22:13.907 "type": "rebuild", 00:22:13.907 "target": "spare", 00:22:13.907 "progress": { 00:22:13.907 "blocks": 63488, 00:22:13.907 "percent": 96 00:22:13.907 } 00:22:13.907 }, 00:22:13.907 "base_bdevs_list": [ 00:22:13.907 { 00:22:13.907 "name": "spare", 00:22:13.907 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:13.907 "is_configured": true, 00:22:13.907 "data_offset": 0, 00:22:13.907 "data_size": 65536 00:22:13.907 }, 00:22:13.907 { 00:22:13.907 "name": null, 00:22:13.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.907 "is_configured": false, 00:22:13.907 "data_offset": 0, 00:22:13.907 "data_size": 65536 00:22:13.907 }, 00:22:13.907 { 00:22:13.907 "name": "BaseBdev3", 00:22:13.907 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:13.907 "is_configured": true, 00:22:13.907 "data_offset": 0, 00:22:13.907 "data_size": 65536 00:22:13.907 }, 00:22:13.907 { 00:22:13.907 "name": "BaseBdev4", 00:22:13.907 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:13.907 "is_configured": true, 00:22:13.907 "data_offset": 0, 00:22:13.907 "data_size": 
65536 00:22:13.907 } 00:22:13.907 ] 00:22:13.907 }' 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:13.907 13:50:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:13.907 [2024-06-10 13:50:28.321189] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:13.907 [2024-06-10 13:50:28.322811] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.901 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.161 "name": "raid_bdev1", 00:22:15.161 
"uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:15.161 "strip_size_kb": 0, 00:22:15.161 "state": "online", 00:22:15.161 "raid_level": "raid1", 00:22:15.161 "superblock": false, 00:22:15.161 "num_base_bdevs": 4, 00:22:15.161 "num_base_bdevs_discovered": 3, 00:22:15.161 "num_base_bdevs_operational": 3, 00:22:15.161 "base_bdevs_list": [ 00:22:15.161 { 00:22:15.161 "name": "spare", 00:22:15.161 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:15.161 "is_configured": true, 00:22:15.161 "data_offset": 0, 00:22:15.161 "data_size": 65536 00:22:15.161 }, 00:22:15.161 { 00:22:15.161 "name": null, 00:22:15.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.161 "is_configured": false, 00:22:15.161 "data_offset": 0, 00:22:15.161 "data_size": 65536 00:22:15.161 }, 00:22:15.161 { 00:22:15.161 "name": "BaseBdev3", 00:22:15.161 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:15.161 "is_configured": true, 00:22:15.161 "data_offset": 0, 00:22:15.161 "data_size": 65536 00:22:15.161 }, 00:22:15.161 { 00:22:15.161 "name": "BaseBdev4", 00:22:15.161 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:15.161 "is_configured": true, 00:22:15.161 "data_offset": 0, 00:22:15.161 "data_size": 65536 00:22:15.161 } 00:22:15.161 ] 00:22:15.161 }' 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.161 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.422 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.422 "name": "raid_bdev1", 00:22:15.422 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:15.422 "strip_size_kb": 0, 00:22:15.422 "state": "online", 00:22:15.422 "raid_level": "raid1", 00:22:15.422 "superblock": false, 00:22:15.422 "num_base_bdevs": 4, 00:22:15.422 "num_base_bdevs_discovered": 3, 00:22:15.422 "num_base_bdevs_operational": 3, 00:22:15.422 "base_bdevs_list": [ 00:22:15.422 { 00:22:15.422 "name": "spare", 00:22:15.422 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:15.422 "is_configured": true, 00:22:15.422 "data_offset": 0, 00:22:15.422 "data_size": 65536 00:22:15.422 }, 00:22:15.422 { 00:22:15.422 "name": null, 00:22:15.422 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.422 "is_configured": false, 00:22:15.422 "data_offset": 0, 00:22:15.422 "data_size": 65536 00:22:15.422 }, 00:22:15.422 { 00:22:15.422 "name": "BaseBdev3", 00:22:15.422 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:15.422 "is_configured": true, 00:22:15.422 "data_offset": 0, 00:22:15.422 "data_size": 65536 00:22:15.422 }, 00:22:15.422 { 00:22:15.422 "name": "BaseBdev4", 00:22:15.422 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:15.422 "is_configured": true, 
00:22:15.422 "data_offset": 0, 00:22:15.422 "data_size": 65536 00:22:15.422 } 00:22:15.422 ] 00:22:15.422 }' 00:22:15.422 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.422 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:15.422 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.682 13:50:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.682 
13:50:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.682 "name": "raid_bdev1", 00:22:15.682 "uuid": "2cee3014-70c9-491a-a22b-bffe6410b45a", 00:22:15.682 "strip_size_kb": 0, 00:22:15.682 "state": "online", 00:22:15.682 "raid_level": "raid1", 00:22:15.682 "superblock": false, 00:22:15.682 "num_base_bdevs": 4, 00:22:15.682 "num_base_bdevs_discovered": 3, 00:22:15.682 "num_base_bdevs_operational": 3, 00:22:15.682 "base_bdevs_list": [ 00:22:15.682 { 00:22:15.682 "name": "spare", 00:22:15.682 "uuid": "a50789c2-bc57-5a43-942d-e3e77db0bb65", 00:22:15.682 "is_configured": true, 00:22:15.682 "data_offset": 0, 00:22:15.682 "data_size": 65536 00:22:15.682 }, 00:22:15.682 { 00:22:15.682 "name": null, 00:22:15.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.682 "is_configured": false, 00:22:15.682 "data_offset": 0, 00:22:15.682 "data_size": 65536 00:22:15.682 }, 00:22:15.682 { 00:22:15.682 "name": "BaseBdev3", 00:22:15.682 "uuid": "69b83b16-6306-5835-b719-4da89964d7bf", 00:22:15.682 "is_configured": true, 00:22:15.682 "data_offset": 0, 00:22:15.682 "data_size": 65536 00:22:15.682 }, 00:22:15.682 { 00:22:15.682 "name": "BaseBdev4", 00:22:15.682 "uuid": "ce225dc7-152a-5e23-9b29-7343dd26c0f5", 00:22:15.682 "is_configured": true, 00:22:15.682 "data_offset": 0, 00:22:15.682 "data_size": 65536 00:22:15.682 } 00:22:15.682 ] 00:22:15.682 }' 00:22:15.682 13:50:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.682 13:50:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:16.253 13:50:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:16.513 [2024-06-10 13:50:30.856020] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:16.513 [2024-06-10 13:50:30.856041] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: 
raid bdev state changing from online to offline 00:22:16.513 00:22:16.513 Latency(us) 00:22:16.513 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:16.514 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:16.514 raid_bdev1 : 11.81 94.54 283.63 0.00 0.00 14782.70 261.12 116217.17 00:22:16.514 =================================================================================================================== 00:22:16.514 Total : 94.54 283.63 0.00 0.00 14782.70 261.12 116217.17 00:22:16.514 [2024-06-10 13:50:30.939670] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:16.514 [2024-06-10 13:50:30.939695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:16.514 [2024-06-10 13:50:30.939772] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:16.514 [2024-06-10 13:50:30.939778] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x158d330 name raid_bdev1, state offline 00:22:16.514 0 00:22:16.514 13:50:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.514 13:50:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:16.775 13:50:31 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:16.775 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:17.036 /dev/nbd0 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:17.036 1+0 records in 00:22:17.036 1+0 records out 00:22:17.036 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244568 s, 16.7 MB/s 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev3') 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:17.036 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:22:17.297 /dev/nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:17.297 1+0 records in 00:22:17.297 1+0 records out 00:22:17.297 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271163 s, 15.1 MB/s 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:17.297 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:22:17.558 /dev/nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:17.558 13:50:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:17.558 1+0 records in 00:22:17.558 1+0 records out 00:22:17.558 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218727 s, 18.7 MB/s 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:17.558 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:17.818 13:50:32 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:17.818 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1652791 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 1652791 ']' 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 1652791 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:18.078 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1652791 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1652791' 00:22:18.338 killing process with pid 1652791 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 1652791 00:22:18.338 Received shutdown signal, test time was about 13.431546 seconds 00:22:18.338 00:22:18.338 Latency(us) 00:22:18.338 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:18.338 =================================================================================================================== 00:22:18.338 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:18.338 [2024-06-10 13:50:32.556815] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 1652791 00:22:18.338 [2024-06-10 13:50:32.580453] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # 
return 0 00:22:18.338 00:22:18.338 real 0m18.190s 00:22:18.338 user 0m28.031s 00:22:18.338 sys 0m2.329s 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:18.338 ************************************ 00:22:18.338 END TEST raid_rebuild_test_io 00:22:18.338 ************************************ 00:22:18.338 13:50:32 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:22:18.338 13:50:32 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:18.338 13:50:32 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:18.338 13:50:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:18.338 ************************************ 00:22:18.338 START TEST raid_rebuild_test_sb_io 00:22:18.338 ************************************ 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true true true 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:18.338 13:50:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:18.338 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' 
raid1 ']' 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1656671 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1656671 /var/tmp/spdk-raid.sock 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 1656671 ']' 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:18.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:18.339 13:50:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:18.599 [2024-06-10 13:50:32.859853] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:22:18.599 [2024-06-10 13:50:32.859901] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656671 ] 00:22:18.599 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:18.599 Zero copy mechanism will not be used. 00:22:18.599 [2024-06-10 13:50:32.948055] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:18.599 [2024-06-10 13:50:33.015400] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:22:18.599 [2024-06-10 13:50:33.055445] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:18.599 [2024-06-10 13:50:33.055469] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:19.540 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:19.540 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:22:19.540 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:19.540 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:19.540 BaseBdev1_malloc 00:22:19.540 13:50:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:19.800 [2024-06-10 13:50:34.094759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:19.800 [2024-06-10 13:50:34.094793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.800 [2024-06-10 13:50:34.094809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x2224900 00:22:19.800 [2024-06-10 13:50:34.094816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.800 [2024-06-10 13:50:34.096212] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.800 [2024-06-10 13:50:34.096233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:19.800 BaseBdev1 00:22:19.800 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:19.800 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:20.060 BaseBdev2_malloc 00:22:20.060 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:20.060 [2024-06-10 13:50:34.497987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:20.060 [2024-06-10 13:50:34.498015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.060 [2024-06-10 13:50:34.498027] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22259c0 00:22:20.060 [2024-06-10 13:50:34.498033] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.060 [2024-06-10 13:50:34.499299] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.060 [2024-06-10 13:50:34.499318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:20.060 BaseBdev2 00:22:20.060 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:20.060 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:20.319 BaseBdev3_malloc 00:22:20.319 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:22:20.580 [2024-06-10 13:50:34.901114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:22:20.580 [2024-06-10 13:50:34.901144] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.580 [2024-06-10 13:50:34.901157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cfe80 00:22:20.580 [2024-06-10 13:50:34.901168] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.580 [2024-06-10 13:50:34.902432] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.580 [2024-06-10 13:50:34.902452] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:20.580 BaseBdev3 00:22:20.580 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:20.580 13:50:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:20.840 BaseBdev4_malloc 00:22:20.840 13:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:22:20.840 [2024-06-10 13:50:35.304166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:22:20.840 [2024-06-10 13:50:35.304195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:22:20.840 [2024-06-10 13:50:35.304206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d2b20 00:22:20.840 [2024-06-10 13:50:35.304213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.840 [2024-06-10 13:50:35.305481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.840 [2024-06-10 13:50:35.305499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:20.840 BaseBdev4 00:22:21.100 13:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:21.100 spare_malloc 00:22:21.100 13:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:21.360 spare_delay 00:22:21.360 13:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:21.619 [2024-06-10 13:50:35.907696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:21.619 [2024-06-10 13:50:35.907724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.619 [2024-06-10 13:50:35.907737] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d5730 00:22:21.619 [2024-06-10 13:50:35.907744] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.619 [2024-06-10 13:50:35.909019] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.619 [2024-06-10 13:50:35.909039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:21.619 
spare 00:22:21.619 13:50:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:22:21.879 [2024-06-10 13:50:36.108230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:21.879 [2024-06-10 13:50:36.109330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:21.879 [2024-06-10 13:50:36.109377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:21.879 [2024-06-10 13:50:36.109415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:21.879 [2024-06-10 13:50:36.109569] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d4330 00:22:21.879 [2024-06-10 13:50:36.109576] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:21.879 [2024-06-10 13:50:36.109734] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d4c70 00:22:21.879 [2024-06-10 13:50:36.109854] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d4330 00:22:21.879 [2024-06-10 13:50:36.109860] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d4330 00:22:21.879 [2024-06-10 13:50:36.109931] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.879 "name": "raid_bdev1", 00:22:21.879 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:21.879 "strip_size_kb": 0, 00:22:21.879 "state": "online", 00:22:21.879 "raid_level": "raid1", 00:22:21.879 "superblock": true, 00:22:21.879 "num_base_bdevs": 4, 00:22:21.879 "num_base_bdevs_discovered": 4, 00:22:21.879 "num_base_bdevs_operational": 4, 00:22:21.879 "base_bdevs_list": [ 00:22:21.879 { 00:22:21.879 "name": "BaseBdev1", 00:22:21.879 "uuid": "4fbc1690-a6ad-5b94-948b-15a060258e51", 00:22:21.879 "is_configured": true, 00:22:21.879 "data_offset": 2048, 00:22:21.879 "data_size": 63488 00:22:21.879 }, 00:22:21.879 { 00:22:21.879 "name": "BaseBdev2", 00:22:21.879 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:21.879 "is_configured": true, 00:22:21.879 "data_offset": 2048, 00:22:21.879 "data_size": 63488 00:22:21.879 }, 00:22:21.879 { 00:22:21.879 "name": "BaseBdev3", 
00:22:21.879 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:21.879 "is_configured": true, 00:22:21.879 "data_offset": 2048, 00:22:21.879 "data_size": 63488 00:22:21.879 }, 00:22:21.879 { 00:22:21.879 "name": "BaseBdev4", 00:22:21.879 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:21.879 "is_configured": true, 00:22:21.879 "data_offset": 2048, 00:22:21.879 "data_size": 63488 00:22:21.879 } 00:22:21.879 ] 00:22:21.879 }' 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.879 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:22.449 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:22.449 13:50:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:22.709 [2024-06-10 13:50:37.094952] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:22.709 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:22.709 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.709 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:22.969 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:22.969 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:22.969 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:22.969 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:22.969 [2024-06-10 13:50:37.416990] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x235b9e0 00:22:22.969 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:22.969 Zero copy mechanism will not be used. 00:22:22.969 Running I/O for 60 seconds... 00:22:23.229 [2024-06-10 13:50:37.513258] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:23.229 [2024-06-10 13:50:37.520262] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x235b9e0 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:23.229 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.489 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.489 "name": "raid_bdev1", 00:22:23.489 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:23.489 "strip_size_kb": 0, 00:22:23.489 "state": "online", 00:22:23.489 "raid_level": "raid1", 00:22:23.489 "superblock": true, 00:22:23.489 "num_base_bdevs": 4, 00:22:23.489 "num_base_bdevs_discovered": 3, 00:22:23.489 "num_base_bdevs_operational": 3, 00:22:23.489 "base_bdevs_list": [ 00:22:23.489 { 00:22:23.489 "name": null, 00:22:23.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.489 "is_configured": false, 00:22:23.489 "data_offset": 2048, 00:22:23.489 "data_size": 63488 00:22:23.489 }, 00:22:23.489 { 00:22:23.489 "name": "BaseBdev2", 00:22:23.489 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:23.489 "is_configured": true, 00:22:23.489 "data_offset": 2048, 00:22:23.489 "data_size": 63488 00:22:23.489 }, 00:22:23.489 { 00:22:23.489 "name": "BaseBdev3", 00:22:23.489 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:23.489 "is_configured": true, 00:22:23.489 "data_offset": 2048, 00:22:23.489 "data_size": 63488 00:22:23.489 }, 00:22:23.489 { 00:22:23.489 "name": "BaseBdev4", 00:22:23.489 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:23.489 "is_configured": true, 00:22:23.489 "data_offset": 2048, 00:22:23.489 "data_size": 63488 00:22:23.489 } 00:22:23.489 ] 00:22:23.489 }' 00:22:23.489 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.489 13:50:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:24.059 13:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 
spare 00:22:24.059 [2024-06-10 13:50:38.525395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:24.320 13:50:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:24.320 [2024-06-10 13:50:38.577791] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x235a230 00:22:24.320 [2024-06-10 13:50:38.579651] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:24.320 [2024-06-10 13:50:38.704367] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:24.320 [2024-06-10 13:50:38.705117] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:24.580 [2024-06-10 13:50:38.940274] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:24.580 [2024-06-10 13:50:38.940393] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.151 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:22:25.411 [2024-06-10 13:50:39.670531] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.411 "name": "raid_bdev1", 00:22:25.411 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:25.411 "strip_size_kb": 0, 00:22:25.411 "state": "online", 00:22:25.411 "raid_level": "raid1", 00:22:25.411 "superblock": true, 00:22:25.411 "num_base_bdevs": 4, 00:22:25.411 "num_base_bdevs_discovered": 4, 00:22:25.411 "num_base_bdevs_operational": 4, 00:22:25.411 "process": { 00:22:25.411 "type": "rebuild", 00:22:25.411 "target": "spare", 00:22:25.411 "progress": { 00:22:25.411 "blocks": 14336, 00:22:25.411 "percent": 22 00:22:25.411 } 00:22:25.411 }, 00:22:25.411 "base_bdevs_list": [ 00:22:25.411 { 00:22:25.411 "name": "spare", 00:22:25.411 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:25.411 "is_configured": true, 00:22:25.411 "data_offset": 2048, 00:22:25.411 "data_size": 63488 00:22:25.411 }, 00:22:25.411 { 00:22:25.411 "name": "BaseBdev2", 00:22:25.411 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:25.411 "is_configured": true, 00:22:25.411 "data_offset": 2048, 00:22:25.411 "data_size": 63488 00:22:25.411 }, 00:22:25.411 { 00:22:25.411 "name": "BaseBdev3", 00:22:25.411 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:25.411 "is_configured": true, 00:22:25.411 "data_offset": 2048, 00:22:25.411 "data_size": 63488 00:22:25.411 }, 00:22:25.411 { 00:22:25.411 "name": "BaseBdev4", 00:22:25.411 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:25.411 "is_configured": true, 00:22:25.411 "data_offset": 2048, 00:22:25.411 "data_size": 63488 00:22:25.411 } 00:22:25.411 ] 00:22:25.411 }' 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.411 [2024-06-10 13:50:39.780512] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.411 13:50:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:25.671 [2024-06-10 13:50:40.023520] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:25.671 [2024-06-10 13:50:40.030794] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:25.671 [2024-06-10 13:50:40.031086] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:25.671 [2024-06-10 13:50:40.139903] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:25.671 [2024-06-10 13:50:40.143127] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:25.671 [2024-06-10 13:50:40.143150] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:25.671 [2024-06-10 13:50:40.143155] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:25.931 [2024-06-10 13:50:40.168945] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x235b9e0 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.931 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.191 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.191 "name": "raid_bdev1", 00:22:26.191 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:26.191 "strip_size_kb": 0, 00:22:26.191 "state": "online", 00:22:26.191 "raid_level": "raid1", 00:22:26.191 "superblock": true, 00:22:26.191 "num_base_bdevs": 4, 00:22:26.191 "num_base_bdevs_discovered": 3, 00:22:26.191 "num_base_bdevs_operational": 3, 00:22:26.191 "base_bdevs_list": [ 00:22:26.191 { 00:22:26.191 "name": null, 00:22:26.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.191 "is_configured": false, 00:22:26.191 "data_offset": 2048, 00:22:26.191 "data_size": 63488 00:22:26.191 }, 00:22:26.191 { 00:22:26.191 "name": "BaseBdev2", 
00:22:26.191 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:26.191 "is_configured": true, 00:22:26.191 "data_offset": 2048, 00:22:26.191 "data_size": 63488 00:22:26.191 }, 00:22:26.191 { 00:22:26.191 "name": "BaseBdev3", 00:22:26.191 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:26.191 "is_configured": true, 00:22:26.191 "data_offset": 2048, 00:22:26.191 "data_size": 63488 00:22:26.191 }, 00:22:26.191 { 00:22:26.191 "name": "BaseBdev4", 00:22:26.191 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:26.191 "is_configured": true, 00:22:26.191 "data_offset": 2048, 00:22:26.191 "data_size": 63488 00:22:26.191 } 00:22:26.191 ] 00:22:26.191 }' 00:22:26.191 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.191 13:50:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:26.761 "name": "raid_bdev1", 00:22:26.761 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:26.761 "strip_size_kb": 0, 
00:22:26.761 "state": "online", 00:22:26.761 "raid_level": "raid1", 00:22:26.761 "superblock": true, 00:22:26.761 "num_base_bdevs": 4, 00:22:26.761 "num_base_bdevs_discovered": 3, 00:22:26.761 "num_base_bdevs_operational": 3, 00:22:26.761 "base_bdevs_list": [ 00:22:26.761 { 00:22:26.761 "name": null, 00:22:26.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.761 "is_configured": false, 00:22:26.761 "data_offset": 2048, 00:22:26.761 "data_size": 63488 00:22:26.761 }, 00:22:26.761 { 00:22:26.761 "name": "BaseBdev2", 00:22:26.761 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:26.761 "is_configured": true, 00:22:26.761 "data_offset": 2048, 00:22:26.761 "data_size": 63488 00:22:26.761 }, 00:22:26.761 { 00:22:26.761 "name": "BaseBdev3", 00:22:26.761 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:26.761 "is_configured": true, 00:22:26.761 "data_offset": 2048, 00:22:26.761 "data_size": 63488 00:22:26.761 }, 00:22:26.761 { 00:22:26.761 "name": "BaseBdev4", 00:22:26.761 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:26.761 "is_configured": true, 00:22:26.761 "data_offset": 2048, 00:22:26.761 "data_size": 63488 00:22:26.761 } 00:22:26.761 ] 00:22:26.761 }' 00:22:26.761 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:27.021 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:27.021 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:27.021 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:27.021 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:27.281 [2024-06-10 13:50:41.509145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 
00:22:27.281 13:50:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:27.281 [2024-06-10 13:50:41.554213] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ca870 00:22:27.281 [2024-06-10 13:50:41.555455] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:27.281 [2024-06-10 13:50:41.680245] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:27.281 [2024-06-10 13:50:41.680508] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:27.540 [2024-06-10 13:50:41.818400] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:27.802 [2024-06-10 13:50:42.162081] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:28.064 [2024-06-10 13:50:42.379124] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:28.064 [2024-06-10 13:50:42.379465] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.324 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:28.324 "name": "raid_bdev1", 00:22:28.324 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:28.324 "strip_size_kb": 0, 00:22:28.324 "state": "online", 00:22:28.324 "raid_level": "raid1", 00:22:28.324 "superblock": true, 00:22:28.324 "num_base_bdevs": 4, 00:22:28.324 "num_base_bdevs_discovered": 4, 00:22:28.324 "num_base_bdevs_operational": 4, 00:22:28.324 "process": { 00:22:28.324 "type": "rebuild", 00:22:28.324 "target": "spare", 00:22:28.324 "progress": { 00:22:28.324 "blocks": 12288, 00:22:28.324 "percent": 19 00:22:28.324 } 00:22:28.324 }, 00:22:28.324 "base_bdevs_list": [ 00:22:28.324 { 00:22:28.324 "name": "spare", 00:22:28.324 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:28.324 "is_configured": true, 00:22:28.324 "data_offset": 2048, 00:22:28.324 "data_size": 63488 00:22:28.324 }, 00:22:28.324 { 00:22:28.324 "name": "BaseBdev2", 00:22:28.324 "uuid": "c7f2ea0e-0815-5f44-8097-ec8e36a36523", 00:22:28.324 "is_configured": true, 00:22:28.324 "data_offset": 2048, 00:22:28.324 "data_size": 63488 00:22:28.324 }, 00:22:28.324 { 00:22:28.324 "name": "BaseBdev3", 00:22:28.324 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:28.324 "is_configured": true, 00:22:28.324 "data_offset": 2048, 00:22:28.324 "data_size": 63488 00:22:28.324 }, 00:22:28.324 { 00:22:28.324 "name": "BaseBdev4", 00:22:28.324 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:28.324 "is_configured": true, 00:22:28.325 "data_offset": 2048, 00:22:28.325 "data_size": 63488 00:22:28.325 } 00:22:28.325 ] 00:22:28.325 }' 00:22:28.325 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:22:28.325 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:28.584 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:28.584 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:28.584 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:28.584 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:28.585 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:28.585 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:22:28.585 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:28.585 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:22:28.585 13:50:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:28.585 [2024-06-10 13:50:42.911735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:22:28.585 [2024-06-10 13:50:43.032236] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:29.154 [2024-06-10 13:50:43.364773] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x235b9e0 00:22:29.154 [2024-06-10 13:50:43.364793] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x23ca870 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:22:29.154 
13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.154 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.415 "name": "raid_bdev1", 00:22:29.415 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:29.415 "strip_size_kb": 0, 00:22:29.415 "state": "online", 00:22:29.415 "raid_level": "raid1", 00:22:29.415 "superblock": true, 00:22:29.415 "num_base_bdevs": 4, 00:22:29.415 "num_base_bdevs_discovered": 3, 00:22:29.415 "num_base_bdevs_operational": 3, 00:22:29.415 "process": { 00:22:29.415 "type": "rebuild", 00:22:29.415 "target": "spare", 00:22:29.415 "progress": { 00:22:29.415 "blocks": 22528, 00:22:29.415 "percent": 35 00:22:29.415 } 00:22:29.415 }, 00:22:29.415 "base_bdevs_list": [ 00:22:29.415 { 00:22:29.415 "name": "spare", 00:22:29.415 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": null, 00:22:29.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.415 "is_configured": false, 00:22:29.415 "data_offset": 2048, 
00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": "BaseBdev3", 00:22:29.415 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": "BaseBdev4", 00:22:29.415 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 } 00:22:29.415 ] 00:22:29.415 }' 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=831 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:22:29.415 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.675 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:29.675 "name": "raid_bdev1", 00:22:29.675 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:29.675 "strip_size_kb": 0, 00:22:29.675 "state": "online", 00:22:29.675 "raid_level": "raid1", 00:22:29.675 "superblock": true, 00:22:29.675 "num_base_bdevs": 4, 00:22:29.675 "num_base_bdevs_discovered": 3, 00:22:29.675 "num_base_bdevs_operational": 3, 00:22:29.675 "process": { 00:22:29.675 "type": "rebuild", 00:22:29.675 "target": "spare", 00:22:29.675 "progress": { 00:22:29.675 "blocks": 28672, 00:22:29.675 "percent": 45 00:22:29.675 } 00:22:29.675 }, 00:22:29.675 "base_bdevs_list": [ 00:22:29.675 { 00:22:29.675 "name": "spare", 00:22:29.675 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:29.675 "is_configured": true, 00:22:29.675 "data_offset": 2048, 00:22:29.675 "data_size": 63488 00:22:29.675 }, 00:22:29.675 { 00:22:29.675 "name": null, 00:22:29.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:29.675 "is_configured": false, 00:22:29.675 "data_offset": 2048, 00:22:29.675 "data_size": 63488 00:22:29.675 }, 00:22:29.675 { 00:22:29.675 "name": "BaseBdev3", 00:22:29.675 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:29.675 "is_configured": true, 00:22:29.675 "data_offset": 2048, 00:22:29.675 "data_size": 63488 00:22:29.675 }, 00:22:29.675 { 00:22:29.675 "name": "BaseBdev4", 00:22:29.675 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:29.675 "is_configured": true, 00:22:29.675 "data_offset": 2048, 00:22:29.675 "data_size": 63488 00:22:29.675 } 00:22:29.675 ] 00:22:29.675 }' 00:22:29.675 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:29.675 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
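The `verify_raid_bdev_process` calls traced above pull `.process.type` and `.process.target` out of the RPC output with `jq` (defaulting to `"none"`), then compare them with `[[ rebuild == \r\e\b\u\i\l\d ]]`. Escaping every character of the right-hand side is bash's way of forcing a literal comparison inside `[[ ]]`, so glob metacharacters can never sneak into the match. A minimal self-contained sketch of that idiom (`check_process` is a hypothetical name, not a function in bdev_raid.sh):

```shell
#!/usr/bin/env bash
# Inside [[ ]], an unquoted right-hand side is a glob pattern; escaping
# each character (as xtrace prints it) makes it a literal string, which
# is equivalent to quoting it.
check_process() {
        local process_type=$1
        # same as: [[ $process_type == "rebuild" ]]
        [[ $process_type == \r\e\b\u\i\l\d ]]
}

check_process "rebuild" && echo "rebuild in progress"
check_process "none" || echo "no process running"
```

The xtrace output shows the escaped form because `set -x` re-quotes the pattern when echoing the command, not because the script author wrote the backslashes.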
00:22:29.675 13:50:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:29.675 13:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:29.675 13:50:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:29.675 [2024-06-10 13:50:44.078612] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:22:29.935 [2024-06-10 13:50:44.196327] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:30.195 [2024-06-10 13:50:44.422119] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:22:30.195 [2024-06-10 13:50:44.630548] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:22:30.764 [2024-06-10 13:50:45.048992] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:22:30.764 [2024-06-10 13:50:45.049283] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:22:30.764 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:30.764 "name": "raid_bdev1", 00:22:30.764 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:30.764 "strip_size_kb": 0, 00:22:30.764 "state": "online", 00:22:30.764 "raid_level": "raid1", 00:22:30.764 "superblock": true, 00:22:30.764 "num_base_bdevs": 4, 00:22:30.764 "num_base_bdevs_discovered": 3, 00:22:30.764 "num_base_bdevs_operational": 3, 00:22:30.764 "process": { 00:22:30.764 "type": "rebuild", 00:22:30.764 "target": "spare", 00:22:30.764 "progress": { 00:22:30.764 "blocks": 47104, 00:22:30.764 "percent": 74 00:22:30.764 } 00:22:30.764 }, 00:22:30.764 "base_bdevs_list": [ 00:22:30.764 { 00:22:30.764 "name": "spare", 00:22:30.764 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:30.764 "is_configured": true, 00:22:30.764 "data_offset": 2048, 00:22:30.764 "data_size": 63488 00:22:30.764 }, 00:22:30.764 { 00:22:30.764 "name": null, 00:22:30.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.764 "is_configured": false, 00:22:30.764 "data_offset": 2048, 00:22:30.764 "data_size": 63488 00:22:30.764 }, 00:22:30.764 { 00:22:30.764 "name": "BaseBdev3", 00:22:30.764 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:30.764 "is_configured": true, 00:22:30.764 "data_offset": 2048, 00:22:30.764 "data_size": 63488 00:22:30.764 }, 00:22:30.764 { 00:22:30.764 "name": "BaseBdev4", 00:22:30.764 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:30.764 "is_configured": true, 00:22:30.764 "data_offset": 2048, 00:22:30.764 "data_size": 63488 00:22:30.764 } 00:22:30.764 ] 00:22:30.764 }' 00:22:30.764 13:50:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:31.024 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:31.024 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:31.024 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:31.024 13:50:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:31.024 [2024-06-10 13:50:45.470188] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:31.594 [2024-06-10 13:50:45.787798] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:31.594 [2024-06-10 13:50:46.004671] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:22:31.594 [2024-06-10 13:50:46.004814] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.163 [2024-06-10 13:50:46.340199] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:32.163 [2024-06-10 13:50:46.440468] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:32.163 [2024-06-10 13:50:46.442022] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.163 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.163 "name": "raid_bdev1", 00:22:32.163 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:32.163 "strip_size_kb": 0, 00:22:32.163 "state": "online", 00:22:32.163 "raid_level": "raid1", 00:22:32.163 "superblock": true, 00:22:32.163 "num_base_bdevs": 4, 00:22:32.163 "num_base_bdevs_discovered": 3, 00:22:32.163 "num_base_bdevs_operational": 3, 00:22:32.163 "base_bdevs_list": [ 00:22:32.163 { 00:22:32.163 "name": "spare", 00:22:32.163 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:32.163 "is_configured": true, 00:22:32.163 "data_offset": 2048, 00:22:32.163 "data_size": 63488 00:22:32.163 }, 00:22:32.163 { 00:22:32.163 "name": null, 00:22:32.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.163 "is_configured": false, 00:22:32.163 "data_offset": 2048, 00:22:32.163 "data_size": 63488 00:22:32.163 }, 00:22:32.163 { 00:22:32.163 "name": "BaseBdev3", 00:22:32.163 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:32.163 "is_configured": true, 00:22:32.164 "data_offset": 2048, 00:22:32.164 "data_size": 63488 00:22:32.164 }, 00:22:32.164 { 00:22:32.164 "name": "BaseBdev4", 00:22:32.164 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:32.164 "is_configured": true, 00:22:32.164 "data_offset": 2048, 00:22:32.164 "data_size": 
63488 00:22:32.164 } 00:22:32.164 ] 00:22:32.164 }' 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:32.164 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:32.424 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.424 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.424 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.424 "name": "raid_bdev1", 00:22:32.424 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:32.424 "strip_size_kb": 0, 00:22:32.424 "state": "online", 00:22:32.424 "raid_level": "raid1", 00:22:32.424 "superblock": true, 00:22:32.424 "num_base_bdevs": 4, 00:22:32.424 "num_base_bdevs_discovered": 3, 00:22:32.424 "num_base_bdevs_operational": 3, 00:22:32.424 "base_bdevs_list": 
[ 00:22:32.424 { 00:22:32.424 "name": "spare", 00:22:32.424 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:32.424 "is_configured": true, 00:22:32.424 "data_offset": 2048, 00:22:32.424 "data_size": 63488 00:22:32.424 }, 00:22:32.424 { 00:22:32.424 "name": null, 00:22:32.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.424 "is_configured": false, 00:22:32.425 "data_offset": 2048, 00:22:32.425 "data_size": 63488 00:22:32.425 }, 00:22:32.425 { 00:22:32.425 "name": "BaseBdev3", 00:22:32.425 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:32.425 "is_configured": true, 00:22:32.425 "data_offset": 2048, 00:22:32.425 "data_size": 63488 00:22:32.425 }, 00:22:32.425 { 00:22:32.425 "name": "BaseBdev4", 00:22:32.425 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:32.425 "is_configured": true, 00:22:32.425 "data_offset": 2048, 00:22:32.425 "data_size": 63488 00:22:32.425 } 00:22:32.425 ] 00:22:32.425 }' 00:22:32.425 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.425 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:32.425 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.685 13:50:46 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.685 13:50:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.685 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.685 "name": "raid_bdev1", 00:22:32.685 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:32.685 "strip_size_kb": 0, 00:22:32.685 "state": "online", 00:22:32.685 "raid_level": "raid1", 00:22:32.685 "superblock": true, 00:22:32.685 "num_base_bdevs": 4, 00:22:32.685 "num_base_bdevs_discovered": 3, 00:22:32.685 "num_base_bdevs_operational": 3, 00:22:32.685 "base_bdevs_list": [ 00:22:32.685 { 00:22:32.685 "name": "spare", 00:22:32.685 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:32.685 "is_configured": true, 00:22:32.685 "data_offset": 2048, 00:22:32.685 "data_size": 63488 00:22:32.685 }, 00:22:32.685 { 00:22:32.685 "name": null, 00:22:32.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.685 "is_configured": false, 00:22:32.685 "data_offset": 2048, 00:22:32.685 "data_size": 63488 00:22:32.685 }, 00:22:32.685 { 00:22:32.685 "name": "BaseBdev3", 00:22:32.685 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:32.685 "is_configured": true, 00:22:32.685 "data_offset": 2048, 00:22:32.685 
"data_size": 63488 00:22:32.685 }, 00:22:32.685 { 00:22:32.686 "name": "BaseBdev4", 00:22:32.686 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:32.686 "is_configured": true, 00:22:32.686 "data_offset": 2048, 00:22:32.686 "data_size": 63488 00:22:32.686 } 00:22:32.686 ] 00:22:32.686 }' 00:22:32.686 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.686 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:33.256 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:33.516 [2024-06-10 13:50:47.884707] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:33.516 [2024-06-10 13:50:47.884728] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:33.516 00:22:33.516 Latency(us) 00:22:33.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:33.516 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:33.516 raid_bdev1 : 10.53 100.71 302.13 0.00 0.00 13054.79 259.41 124955.31 00:22:33.516 =================================================================================================================== 00:22:33.516 Total : 100.71 302.13 0.00 0.00 13054.79 259.41 124955.31 00:22:33.516 [2024-06-10 13:50:47.972401] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:33.516 [2024-06-10 13:50:47.972426] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:33.516 [2024-06-10 13:50:47.972503] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:33.516 [2024-06-10 13:50:47.972510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d4330 name raid_bdev1, state offline 
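The rebuild monitoring seen above (bdev_raid.sh lines @705-710 in the trace) is a deadline loop: a `timeout` is set once, bash's builtin `SECONDS` counter is checked with `(( SECONDS < timeout ))`, and the script sleeps one second between RPC polls until `.process` disappears from the raid bdev info. A sketch of that polling pattern, with a generic `wait_for` helper standing in for the RPC query (hypothetical name, not part of bdev_raid.sh):

```shell
#!/usr/bin/env bash
# Poll a condition command until it succeeds or a deadline passes,
# mirroring the "(( SECONDS < timeout ))" loop in the trace. SECONDS is
# a bash builtin that counts seconds since shell startup.
wait_for() {
        local timeout=$1; shift
        local deadline=$(( SECONDS + timeout ))
        while (( SECONDS < deadline )); do
                "$@" && return 0   # condition met
                sleep 1
        done
        return 1                   # deadline passed
}

# Example: wait up to 5 s for a marker file created in the background.
marker=$(mktemp -u)
( sleep 1; touch "$marker" ) &
wait_for 5 test -e "$marker" && echo "condition met"
rm -f "$marker"
```

Note the trace's `timeout=831` is an absolute `SECONDS` value (the script had already been running for several minutes), whereas the sketch computes a relative deadline; the loop structure is the same.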
00:22:33.516 0 00:22:33.775 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.775 13:50:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:33.775 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:34.034 /dev/nbd0 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:34.035 1+0 records in 00:22:34.035 1+0 records out 00:22:34.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265473 s, 15.4 MB/s 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # 
(( i++ )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.035 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:22:34.294 /dev/nbd1 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:34.294 13:50:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:34.294 1+0 records in 00:22:34.294 1+0 records out 00:22:34.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227528 s, 18.0 MB/s 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:22:34.294 13:50:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:34.294 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:34.295 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 
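The data check above exposes the rebuilt `spare` and each base bdev over nbd and runs `cmp -i 1048576 /dev/nbd0 /dev/nbd1`: `-i` (`--ignore-initial`) skips the first 1 MiB of both devices, i.e. the superblock region that legitimately differs between members, and compares only the payload. A sketch of that comparison on regular files instead of `/dev/nbd*` (`compare_past_superblock` is a hypothetical helper name, not from the test suite):

```shell
#!/usr/bin/env bash
set -e
# cmp -i SKIP compares two files/devices while ignoring the first SKIP
# bytes of each; -s suppresses output and reports only via exit status.
compare_past_superblock() {
        cmp -s -i 1048576 "$1" "$2"
}

# Demo: different 1 MiB "superblocks", identical payload afterwards.
a=$(mktemp); b=$(mktemp)
head -c 1048576 /dev/urandom > "$a"    # random header
head -c 1048576 /dev/zero    > "$b"    # zeroed header
head -c 65536 /dev/urandom | tee -a "$a" >> "$b"   # shared payload

compare_past_superblock "$a" "$b" && echo "payloads match"
cmp -s "$a" "$b" || echo "headers differ"
rm -f "$a" "$b"
```

A full `cmp` would fail at byte 0 here, which is exactly why the raid test offsets past the superblock before declaring the rebuild correct.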
00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.555 13:50:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:22:34.815 /dev/nbd1 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:34.815 13:50:49 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:34.815 1+0 records in 00:22:34.815 1+0 records out 00:22:34.815 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268004 s, 15.3 MB/s 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:34.815 13:50:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:34.815 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:35.075 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:35.334 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:35.594 13:50:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:35.854 [2024-06-10 13:50:50.104424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:35.854 [2024-06-10 13:50:50.104469] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:35.854 [2024-06-10 13:50:50.104484] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d5960 00:22:35.854 [2024-06-10 13:50:50.104492] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:35.854 [2024-06-10 13:50:50.106060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:35.854 [2024-06-10 13:50:50.106083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:35.854 [2024-06-10 13:50:50.106151] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:35.854 [2024-06-10 13:50:50.106178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:35.854 [2024-06-10 13:50:50.106265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:35.854 [2024-06-10 13:50:50.106329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:35.854 spare 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.854 [2024-06-10 13:50:50.206626] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2223200 00:22:35.854 [2024-06-10 13:50:50.206637] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:35.854 [2024-06-10 13:50:50.206801] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2358c40 00:22:35.854 [2024-06-10 13:50:50.206925] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2223200 00:22:35.854 [2024-06-10 13:50:50.206931] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2223200 00:22:35.854 [2024-06-10 13:50:50.207018] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.854 "name": "raid_bdev1", 00:22:35.854 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:35.854 "strip_size_kb": 0, 00:22:35.854 "state": "online", 00:22:35.854 "raid_level": "raid1", 00:22:35.854 "superblock": true, 00:22:35.854 "num_base_bdevs": 4, 00:22:35.854 "num_base_bdevs_discovered": 3, 00:22:35.854 "num_base_bdevs_operational": 3, 00:22:35.854 "base_bdevs_list": [ 00:22:35.854 { 00:22:35.854 "name": "spare", 00:22:35.854 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:35.854 "is_configured": true, 00:22:35.854 "data_offset": 2048, 00:22:35.854 "data_size": 63488 00:22:35.854 }, 00:22:35.854 { 00:22:35.854 "name": null, 
00:22:35.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.854 "is_configured": false, 00:22:35.854 "data_offset": 2048, 00:22:35.854 "data_size": 63488 00:22:35.854 }, 00:22:35.854 { 00:22:35.854 "name": "BaseBdev3", 00:22:35.854 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:35.854 "is_configured": true, 00:22:35.854 "data_offset": 2048, 00:22:35.854 "data_size": 63488 00:22:35.854 }, 00:22:35.854 { 00:22:35.854 "name": "BaseBdev4", 00:22:35.854 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:35.854 "is_configured": true, 00:22:35.854 "data_offset": 2048, 00:22:35.854 "data_size": 63488 00:22:35.854 } 00:22:35.854 ] 00:22:35.854 }' 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.854 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.426 13:50:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.687 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:36.687 "name": "raid_bdev1", 00:22:36.687 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:36.687 "strip_size_kb": 
0, 00:22:36.687 "state": "online", 00:22:36.687 "raid_level": "raid1", 00:22:36.687 "superblock": true, 00:22:36.687 "num_base_bdevs": 4, 00:22:36.687 "num_base_bdevs_discovered": 3, 00:22:36.687 "num_base_bdevs_operational": 3, 00:22:36.687 "base_bdevs_list": [ 00:22:36.687 { 00:22:36.687 "name": "spare", 00:22:36.687 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:36.687 "is_configured": true, 00:22:36.687 "data_offset": 2048, 00:22:36.687 "data_size": 63488 00:22:36.687 }, 00:22:36.687 { 00:22:36.687 "name": null, 00:22:36.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.687 "is_configured": false, 00:22:36.687 "data_offset": 2048, 00:22:36.687 "data_size": 63488 00:22:36.687 }, 00:22:36.687 { 00:22:36.687 "name": "BaseBdev3", 00:22:36.687 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:36.687 "is_configured": true, 00:22:36.687 "data_offset": 2048, 00:22:36.687 "data_size": 63488 00:22:36.687 }, 00:22:36.687 { 00:22:36.687 "name": "BaseBdev4", 00:22:36.687 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:36.687 "is_configured": true, 00:22:36.687 "data_offset": 2048, 00:22:36.687 "data_size": 63488 00:22:36.687 } 00:22:36.687 ] 00:22:36.687 }' 00:22:36.687 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:36.687 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:36.687 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:36.947 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:36.947 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.947 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:36.947 13:50:51 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:36.947 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:37.206 [2024-06-10 13:50:51.580421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.207 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.468 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.468 "name": 
"raid_bdev1", 00:22:37.468 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:37.468 "strip_size_kb": 0, 00:22:37.468 "state": "online", 00:22:37.468 "raid_level": "raid1", 00:22:37.468 "superblock": true, 00:22:37.468 "num_base_bdevs": 4, 00:22:37.468 "num_base_bdevs_discovered": 2, 00:22:37.468 "num_base_bdevs_operational": 2, 00:22:37.468 "base_bdevs_list": [ 00:22:37.468 { 00:22:37.468 "name": null, 00:22:37.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.468 "is_configured": false, 00:22:37.468 "data_offset": 2048, 00:22:37.468 "data_size": 63488 00:22:37.468 }, 00:22:37.468 { 00:22:37.468 "name": null, 00:22:37.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.468 "is_configured": false, 00:22:37.468 "data_offset": 2048, 00:22:37.468 "data_size": 63488 00:22:37.468 }, 00:22:37.468 { 00:22:37.468 "name": "BaseBdev3", 00:22:37.468 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:37.468 "is_configured": true, 00:22:37.468 "data_offset": 2048, 00:22:37.468 "data_size": 63488 00:22:37.468 }, 00:22:37.468 { 00:22:37.468 "name": "BaseBdev4", 00:22:37.468 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:37.468 "is_configured": true, 00:22:37.468 "data_offset": 2048, 00:22:37.468 "data_size": 63488 00:22:37.468 } 00:22:37.468 ] 00:22:37.468 }' 00:22:37.468 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.468 13:50:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:38.038 13:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:38.298 [2024-06-10 13:50:52.530952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:38.298 [2024-06-10 13:50:52.531063] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid 
bdev raid_bdev1 (6) 00:22:38.298 [2024-06-10 13:50:52.531072] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:38.298 [2024-06-10 13:50:52.531091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:38.298 [2024-06-10 13:50:52.534183] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23584c0 00:22:38.298 [2024-06-10 13:50:52.535906] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:38.298 13:50:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.239 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:39.499 "name": "raid_bdev1", 00:22:39.499 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:39.499 "strip_size_kb": 0, 00:22:39.499 "state": "online", 00:22:39.499 "raid_level": "raid1", 00:22:39.499 "superblock": true, 00:22:39.499 "num_base_bdevs": 4, 00:22:39.499 "num_base_bdevs_discovered": 3, 00:22:39.499 "num_base_bdevs_operational": 3, 
00:22:39.499 "process": { 00:22:39.499 "type": "rebuild", 00:22:39.499 "target": "spare", 00:22:39.499 "progress": { 00:22:39.499 "blocks": 24576, 00:22:39.499 "percent": 38 00:22:39.499 } 00:22:39.499 }, 00:22:39.499 "base_bdevs_list": [ 00:22:39.499 { 00:22:39.499 "name": "spare", 00:22:39.499 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:39.499 "is_configured": true, 00:22:39.499 "data_offset": 2048, 00:22:39.499 "data_size": 63488 00:22:39.499 }, 00:22:39.499 { 00:22:39.499 "name": null, 00:22:39.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.499 "is_configured": false, 00:22:39.499 "data_offset": 2048, 00:22:39.499 "data_size": 63488 00:22:39.499 }, 00:22:39.499 { 00:22:39.499 "name": "BaseBdev3", 00:22:39.499 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:39.499 "is_configured": true, 00:22:39.499 "data_offset": 2048, 00:22:39.499 "data_size": 63488 00:22:39.499 }, 00:22:39.499 { 00:22:39.499 "name": "BaseBdev4", 00:22:39.499 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:39.499 "is_configured": true, 00:22:39.499 "data_offset": 2048, 00:22:39.499 "data_size": 63488 00:22:39.499 } 00:22:39.499 ] 00:22:39.499 }' 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:39.499 13:50:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:39.760 [2024-06-10 13:50:54.045320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:39.760 [2024-06-10 13:50:54.145598] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:39.760 [2024-06-10 13:50:54.145632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:39.760 [2024-06-10 13:50:54.145644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:39.760 [2024-06-10 13:50:54.145648] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.760 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.029 13:50:54 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.029 "name": "raid_bdev1", 00:22:40.029 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:40.029 "strip_size_kb": 0, 00:22:40.029 "state": "online", 00:22:40.029 "raid_level": "raid1", 00:22:40.029 "superblock": true, 00:22:40.029 "num_base_bdevs": 4, 00:22:40.029 "num_base_bdevs_discovered": 2, 00:22:40.029 "num_base_bdevs_operational": 2, 00:22:40.029 "base_bdevs_list": [ 00:22:40.029 { 00:22:40.029 "name": null, 00:22:40.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.029 "is_configured": false, 00:22:40.029 "data_offset": 2048, 00:22:40.029 "data_size": 63488 00:22:40.029 }, 00:22:40.029 { 00:22:40.029 "name": null, 00:22:40.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.029 "is_configured": false, 00:22:40.029 "data_offset": 2048, 00:22:40.029 "data_size": 63488 00:22:40.029 }, 00:22:40.029 { 00:22:40.029 "name": "BaseBdev3", 00:22:40.029 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:40.029 "is_configured": true, 00:22:40.029 "data_offset": 2048, 00:22:40.029 "data_size": 63488 00:22:40.029 }, 00:22:40.029 { 00:22:40.029 "name": "BaseBdev4", 00:22:40.029 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:40.029 "is_configured": true, 00:22:40.029 "data_offset": 2048, 00:22:40.029 "data_size": 63488 00:22:40.029 } 00:22:40.029 ] 00:22:40.029 }' 00:22:40.029 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.029 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:40.743 13:50:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:40.743 [2024-06-10 13:50:55.140205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:40.743 [2024-06-10 13:50:55.140241] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.743 [2024-06-10 13:50:55.140258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2223670 00:22:40.743 [2024-06-10 13:50:55.140266] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.743 [2024-06-10 13:50:55.140578] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.743 [2024-06-10 13:50:55.140591] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:40.743 [2024-06-10 13:50:55.140654] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:40.743 [2024-06-10 13:50:55.140662] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:22:40.743 [2024-06-10 13:50:55.140668] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:22:40.743 [2024-06-10 13:50:55.140679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:40.743 [2024-06-10 13:50:55.143694] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23cf5d0 00:22:40.743 [2024-06-10 13:50:55.144919] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:40.743 spare 00:22:40.743 13:50:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:42.126 "name": "raid_bdev1", 00:22:42.126 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:42.126 "strip_size_kb": 0, 00:22:42.126 "state": "online", 00:22:42.126 "raid_level": "raid1", 00:22:42.126 "superblock": true, 00:22:42.126 "num_base_bdevs": 4, 00:22:42.126 "num_base_bdevs_discovered": 3, 00:22:42.126 "num_base_bdevs_operational": 3, 00:22:42.126 "process": { 00:22:42.126 "type": "rebuild", 00:22:42.126 "target": "spare", 00:22:42.126 "progress": { 00:22:42.126 
"blocks": 24576, 00:22:42.126 "percent": 38 00:22:42.126 } 00:22:42.126 }, 00:22:42.126 "base_bdevs_list": [ 00:22:42.126 { 00:22:42.126 "name": "spare", 00:22:42.126 "uuid": "091337fe-e286-5089-8df2-a10164f081bd", 00:22:42.126 "is_configured": true, 00:22:42.126 "data_offset": 2048, 00:22:42.126 "data_size": 63488 00:22:42.126 }, 00:22:42.126 { 00:22:42.126 "name": null, 00:22:42.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.126 "is_configured": false, 00:22:42.126 "data_offset": 2048, 00:22:42.126 "data_size": 63488 00:22:42.126 }, 00:22:42.126 { 00:22:42.126 "name": "BaseBdev3", 00:22:42.126 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:42.126 "is_configured": true, 00:22:42.126 "data_offset": 2048, 00:22:42.126 "data_size": 63488 00:22:42.126 }, 00:22:42.126 { 00:22:42.126 "name": "BaseBdev4", 00:22:42.126 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:42.126 "is_configured": true, 00:22:42.126 "data_offset": 2048, 00:22:42.126 "data_size": 63488 00:22:42.126 } 00:22:42.126 ] 00:22:42.126 }' 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:42.126 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:42.387 [2024-06-10 13:50:56.654334] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.387 [2024-06-10 13:50:56.754610] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:42.387 [2024-06-10 
13:50:56.754644] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.387 [2024-06-10 13:50:56.754655] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:42.387 [2024-06-10 13:50:56.754659] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.387 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.647 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.647 "name": "raid_bdev1", 00:22:42.647 "uuid": 
"7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:42.647 "strip_size_kb": 0, 00:22:42.647 "state": "online", 00:22:42.647 "raid_level": "raid1", 00:22:42.647 "superblock": true, 00:22:42.647 "num_base_bdevs": 4, 00:22:42.647 "num_base_bdevs_discovered": 2, 00:22:42.647 "num_base_bdevs_operational": 2, 00:22:42.647 "base_bdevs_list": [ 00:22:42.647 { 00:22:42.647 "name": null, 00:22:42.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.647 "is_configured": false, 00:22:42.647 "data_offset": 2048, 00:22:42.647 "data_size": 63488 00:22:42.647 }, 00:22:42.647 { 00:22:42.647 "name": null, 00:22:42.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.647 "is_configured": false, 00:22:42.647 "data_offset": 2048, 00:22:42.647 "data_size": 63488 00:22:42.647 }, 00:22:42.647 { 00:22:42.647 "name": "BaseBdev3", 00:22:42.647 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:42.647 "is_configured": true, 00:22:42.647 "data_offset": 2048, 00:22:42.647 "data_size": 63488 00:22:42.647 }, 00:22:42.647 { 00:22:42.647 "name": "BaseBdev4", 00:22:42.647 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:42.647 "is_configured": true, 00:22:42.647 "data_offset": 2048, 00:22:42.647 "data_size": 63488 00:22:42.647 } 00:22:42.647 ] 00:22:42.647 }' 00:22:42.647 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.647 13:50:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.217 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.477 "name": "raid_bdev1", 00:22:43.477 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:43.477 "strip_size_kb": 0, 00:22:43.477 "state": "online", 00:22:43.477 "raid_level": "raid1", 00:22:43.477 "superblock": true, 00:22:43.477 "num_base_bdevs": 4, 00:22:43.477 "num_base_bdevs_discovered": 2, 00:22:43.477 "num_base_bdevs_operational": 2, 00:22:43.477 "base_bdevs_list": [ 00:22:43.477 { 00:22:43.477 "name": null, 00:22:43.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.477 "is_configured": false, 00:22:43.477 "data_offset": 2048, 00:22:43.477 "data_size": 63488 00:22:43.477 }, 00:22:43.477 { 00:22:43.477 "name": null, 00:22:43.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:43.477 "is_configured": false, 00:22:43.477 "data_offset": 2048, 00:22:43.477 "data_size": 63488 00:22:43.477 }, 00:22:43.477 { 00:22:43.477 "name": "BaseBdev3", 00:22:43.477 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:43.477 "is_configured": true, 00:22:43.477 "data_offset": 2048, 00:22:43.477 "data_size": 63488 00:22:43.477 }, 00:22:43.477 { 00:22:43.477 "name": "BaseBdev4", 00:22:43.477 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:43.477 "is_configured": true, 00:22:43.477 "data_offset": 2048, 00:22:43.477 "data_size": 63488 00:22:43.477 } 00:22:43.477 ] 00:22:43.477 }' 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 
-- # [[ none == \n\o\n\e ]] 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:43.477 13:50:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:43.737 13:50:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:43.737 [2024-06-10 13:50:58.194402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:43.737 [2024-06-10 13:50:58.194437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.737 [2024-06-10 13:50:58.194451] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ca450 00:22:43.737 [2024-06-10 13:50:58.194458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.737 [2024-06-10 13:50:58.194751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.737 [2024-06-10 13:50:58.194763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:43.737 [2024-06-10 13:50:58.194811] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:43.737 [2024-06-10 13:50:58.194818] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:43.737 [2024-06-10 13:50:58.194824] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:43.737 BaseBdev1 00:22:43.738 13:50:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # 
sleep 1 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:45.119 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:45.120 "name": "raid_bdev1", 00:22:45.120 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:45.120 "strip_size_kb": 0, 00:22:45.120 "state": "online", 00:22:45.120 "raid_level": "raid1", 00:22:45.120 "superblock": true, 00:22:45.120 "num_base_bdevs": 4, 00:22:45.120 "num_base_bdevs_discovered": 2, 00:22:45.120 "num_base_bdevs_operational": 2, 00:22:45.120 "base_bdevs_list": [ 00:22:45.120 { 00:22:45.120 "name": 
null, 00:22:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.120 "is_configured": false, 00:22:45.120 "data_offset": 2048, 00:22:45.120 "data_size": 63488 00:22:45.120 }, 00:22:45.120 { 00:22:45.120 "name": null, 00:22:45.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.120 "is_configured": false, 00:22:45.120 "data_offset": 2048, 00:22:45.120 "data_size": 63488 00:22:45.120 }, 00:22:45.120 { 00:22:45.120 "name": "BaseBdev3", 00:22:45.120 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:45.120 "is_configured": true, 00:22:45.120 "data_offset": 2048, 00:22:45.120 "data_size": 63488 00:22:45.120 }, 00:22:45.120 { 00:22:45.120 "name": "BaseBdev4", 00:22:45.120 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:45.120 "is_configured": true, 00:22:45.120 "data_offset": 2048, 00:22:45.120 "data_size": 63488 00:22:45.120 } 00:22:45.120 ] 00:22:45.120 }' 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:45.120 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.689 13:50:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.949 "name": "raid_bdev1", 00:22:45.949 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:45.949 "strip_size_kb": 0, 00:22:45.949 "state": "online", 00:22:45.949 "raid_level": "raid1", 00:22:45.949 "superblock": true, 00:22:45.949 "num_base_bdevs": 4, 00:22:45.949 "num_base_bdevs_discovered": 2, 00:22:45.949 "num_base_bdevs_operational": 2, 00:22:45.949 "base_bdevs_list": [ 00:22:45.949 { 00:22:45.949 "name": null, 00:22:45.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.949 "is_configured": false, 00:22:45.949 "data_offset": 2048, 00:22:45.949 "data_size": 63488 00:22:45.949 }, 00:22:45.949 { 00:22:45.949 "name": null, 00:22:45.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.949 "is_configured": false, 00:22:45.949 "data_offset": 2048, 00:22:45.949 "data_size": 63488 00:22:45.949 }, 00:22:45.949 { 00:22:45.949 "name": "BaseBdev3", 00:22:45.949 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:45.949 "is_configured": true, 00:22:45.949 "data_offset": 2048, 00:22:45.949 "data_size": 63488 00:22:45.949 }, 00:22:45.949 { 00:22:45.949 "name": "BaseBdev4", 00:22:45.949 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:45.949 "is_configured": true, 00:22:45.949 "data_offset": 2048, 00:22:45.949 "data_size": 63488 00:22:45.949 } 00:22:45.949 ] 00:22:45.949 }' 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:45.949 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:45.950 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:46.209 [2024-06-10 13:51:00.464436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:22:46.209 [2024-06-10 13:51:00.464541] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:22:46.209 [2024-06-10 13:51:00.464550] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:46.209 request: 00:22:46.209 { 00:22:46.209 "raid_bdev": "raid_bdev1", 00:22:46.209 "base_bdev": "BaseBdev1", 00:22:46.209 "method": "bdev_raid_add_base_bdev", 00:22:46.209 "req_id": 1 00:22:46.209 } 00:22:46.209 Got JSON-RPC error response 00:22:46.209 response: 00:22:46.209 { 00:22:46.209 "code": -22, 00:22:46.209 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:46.209 } 00:22:46.209 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:22:46.209 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:22:46.209 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:22:46.209 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:22:46.209 13:51:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.149 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.409 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.409 "name": "raid_bdev1", 00:22:47.409 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:47.409 "strip_size_kb": 0, 00:22:47.409 "state": "online", 00:22:47.409 "raid_level": "raid1", 00:22:47.409 "superblock": true, 00:22:47.409 "num_base_bdevs": 4, 00:22:47.409 "num_base_bdevs_discovered": 2, 00:22:47.409 "num_base_bdevs_operational": 2, 00:22:47.409 "base_bdevs_list": [ 00:22:47.409 { 00:22:47.409 "name": null, 00:22:47.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.409 "is_configured": false, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 }, 00:22:47.409 { 00:22:47.409 "name": null, 00:22:47.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.409 "is_configured": false, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 }, 00:22:47.409 { 00:22:47.409 "name": "BaseBdev3", 00:22:47.409 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:47.409 "is_configured": true, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 }, 00:22:47.409 { 00:22:47.409 "name": "BaseBdev4", 00:22:47.409 "uuid": 
"bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:47.409 "is_configured": true, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 } 00:22:47.409 ] 00:22:47.409 }' 00:22:47.409 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.410 13:51:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.978 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.238 "name": "raid_bdev1", 00:22:48.238 "uuid": "7dfe05e9-ba62-4880-a287-17843edc88cd", 00:22:48.238 "strip_size_kb": 0, 00:22:48.238 "state": "online", 00:22:48.238 "raid_level": "raid1", 00:22:48.238 "superblock": true, 00:22:48.238 "num_base_bdevs": 4, 00:22:48.238 "num_base_bdevs_discovered": 2, 00:22:48.238 "num_base_bdevs_operational": 2, 00:22:48.238 "base_bdevs_list": [ 00:22:48.238 { 00:22:48.238 "name": null, 00:22:48.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.238 "is_configured": false, 00:22:48.238 "data_offset": 2048, 00:22:48.238 "data_size": 63488 
00:22:48.238 }, 00:22:48.238 { 00:22:48.238 "name": null, 00:22:48.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.238 "is_configured": false, 00:22:48.238 "data_offset": 2048, 00:22:48.238 "data_size": 63488 00:22:48.238 }, 00:22:48.238 { 00:22:48.238 "name": "BaseBdev3", 00:22:48.238 "uuid": "e2548c45-33b2-567c-9a30-3a4888c0ac6b", 00:22:48.238 "is_configured": true, 00:22:48.238 "data_offset": 2048, 00:22:48.238 "data_size": 63488 00:22:48.238 }, 00:22:48.238 { 00:22:48.238 "name": "BaseBdev4", 00:22:48.238 "uuid": "bee08510-5dfe-5889-aab3-2acc1012e4e0", 00:22:48.238 "is_configured": true, 00:22:48.238 "data_offset": 2048, 00:22:48.238 "data_size": 63488 00:22:48.238 } 00:22:48.238 ] 00:22:48.238 }' 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1656671 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 1656671 ']' 00:22:48.238 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 1656671 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1656671 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:48.239 13:51:02 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1656671' 00:22:48.239 killing process with pid 1656671 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 1656671 00:22:48.239 Received shutdown signal, test time was about 25.137171 seconds 00:22:48.239 00:22:48.239 Latency(us) 00:22:48.239 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:48.239 =================================================================================================================== 00:22:48.239 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:48.239 [2024-06-10 13:51:02.614975] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:48.239 [2024-06-10 13:51:02.615060] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:48.239 [2024-06-10 13:51:02.615108] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:48.239 [2024-06-10 13:51:02.615116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2223200 name raid_bdev1, state offline 00:22:48.239 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 1656671 00:22:48.239 [2024-06-10 13:51:02.639271] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:48.500 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:22:48.500 00:22:48.500 real 0m29.979s 00:22:48.500 user 0m47.611s 00:22:48.500 sys 0m3.619s 00:22:48.500 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:48.500 13:51:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:48.500 ************************************ 00:22:48.500 END TEST raid_rebuild_test_sb_io 
00:22:48.500 ************************************ 00:22:48.500 13:51:02 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:22:48.500 13:51:02 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:22:48.500 13:51:02 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:22:48.500 13:51:02 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:22:48.500 13:51:02 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:48.500 13:51:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:48.500 ************************************ 00:22:48.500 START TEST raid_state_function_test_sb_4k 00:22:48.500 ************************************ 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo 
BaseBdev2 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1662942 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1662942' 00:22:48.500 Process raid pid: 1662942 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1662942 /var/tmp/spdk-raid.sock 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 1662942 ']' 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:48.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:48.500 13:51:02 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:48.500 [2024-06-10 13:51:02.913827] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:22:48.500 [2024-06-10 13:51:02.913881] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:48.759 [2024-06-10 13:51:03.005733] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:48.759 [2024-06-10 13:51:03.084828] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:22:48.759 [2024-06-10 13:51:03.125306] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:48.759 [2024-06-10 13:51:03.125332] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:49.328 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:49.328 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:22:49.328 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:49.587 [2024-06-10 13:51:03.861691] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:49.587 [2024-06-10 13:51:03.861722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:49.588 [2024-06-10 13:51:03.861728] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:49.588 [2024-06-10 13:51:03.861734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:49.588 13:51:03 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.588 13:51:03 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:49.588 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.588 "name": "Existed_Raid", 00:22:49.588 "uuid": "4aa58a2c-cd82-4355-8a96-80573dba4f7f", 00:22:49.588 "strip_size_kb": 0, 00:22:49.588 "state": "configuring", 00:22:49.588 "raid_level": "raid1", 00:22:49.588 "superblock": true, 00:22:49.588 "num_base_bdevs": 2, 00:22:49.588 "num_base_bdevs_discovered": 0, 00:22:49.588 "num_base_bdevs_operational": 2, 00:22:49.588 "base_bdevs_list": [ 00:22:49.588 { 00:22:49.588 "name": "BaseBdev1", 00:22:49.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.588 "is_configured": false, 00:22:49.588 "data_offset": 0, 00:22:49.588 "data_size": 0 00:22:49.588 }, 
00:22:49.588 { 00:22:49.588 "name": "BaseBdev2", 00:22:49.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:49.588 "is_configured": false, 00:22:49.588 "data_offset": 0, 00:22:49.588 "data_size": 0 00:22:49.588 } 00:22:49.588 ] 00:22:49.588 }' 00:22:49.588 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.588 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:50.157 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:50.417 [2024-06-10 13:51:04.771871] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:50.417 [2024-06-10 13:51:04.771891] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107d720 name Existed_Raid, state configuring 00:22:50.417 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:50.677 [2024-06-10 13:51:04.928294] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:50.677 [2024-06-10 13:51:04.928318] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:50.677 [2024-06-10 13:51:04.928324] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:50.677 [2024-06-10 13:51:04.928330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:50.677 13:51:04 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:22:50.677 [2024-06-10 13:51:05.139654] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:50.677 BaseBdev1 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:50.937 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:51.196 [ 00:22:51.196 { 00:22:51.196 "name": "BaseBdev1", 00:22:51.196 "aliases": [ 00:22:51.196 "2d6954ba-99ef-4f77-a453-966c55cf23de" 00:22:51.196 ], 00:22:51.196 "product_name": "Malloc disk", 00:22:51.196 "block_size": 4096, 00:22:51.196 "num_blocks": 8192, 00:22:51.196 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:51.196 "assigned_rate_limits": { 00:22:51.196 "rw_ios_per_sec": 0, 00:22:51.196 "rw_mbytes_per_sec": 0, 00:22:51.196 "r_mbytes_per_sec": 0, 00:22:51.196 "w_mbytes_per_sec": 0 00:22:51.196 }, 00:22:51.196 "claimed": true, 00:22:51.196 "claim_type": "exclusive_write", 00:22:51.196 "zoned": false, 00:22:51.196 "supported_io_types": { 00:22:51.196 "read": true, 00:22:51.196 "write": true, 00:22:51.196 "unmap": 
true, 00:22:51.196 "write_zeroes": true, 00:22:51.196 "flush": true, 00:22:51.196 "reset": true, 00:22:51.196 "compare": false, 00:22:51.196 "compare_and_write": false, 00:22:51.196 "abort": true, 00:22:51.196 "nvme_admin": false, 00:22:51.196 "nvme_io": false 00:22:51.196 }, 00:22:51.196 "memory_domains": [ 00:22:51.196 { 00:22:51.196 "dma_device_id": "system", 00:22:51.196 "dma_device_type": 1 00:22:51.196 }, 00:22:51.196 { 00:22:51.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.196 "dma_device_type": 2 00:22:51.196 } 00:22:51.196 ], 00:22:51.196 "driver_specific": {} 00:22:51.196 } 00:22:51.196 ] 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.196 13:51:05 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.196 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:51.457 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.457 "name": "Existed_Raid", 00:22:51.457 "uuid": "5adcc885-586c-471c-ac89-fd4bc1726532", 00:22:51.457 "strip_size_kb": 0, 00:22:51.457 "state": "configuring", 00:22:51.457 "raid_level": "raid1", 00:22:51.457 "superblock": true, 00:22:51.457 "num_base_bdevs": 2, 00:22:51.457 "num_base_bdevs_discovered": 1, 00:22:51.457 "num_base_bdevs_operational": 2, 00:22:51.457 "base_bdevs_list": [ 00:22:51.457 { 00:22:51.457 "name": "BaseBdev1", 00:22:51.457 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:51.457 "is_configured": true, 00:22:51.457 "data_offset": 256, 00:22:51.457 "data_size": 7936 00:22:51.457 }, 00:22:51.457 { 00:22:51.457 "name": "BaseBdev2", 00:22:51.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.457 "is_configured": false, 00:22:51.457 "data_offset": 0, 00:22:51.457 "data_size": 0 00:22:51.457 } 00:22:51.457 ] 00:22:51.457 }' 00:22:51.457 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.457 13:51:05 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:52.026 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:52.286 [2024-06-10 13:51:06.515133] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:52.286 [2024-06-10 13:51:06.515165] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107d010 name Existed_Raid, state 
configuring 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:52.286 [2024-06-10 13:51:06.719673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:52.286 [2024-06-10 13:51:06.720898] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:52.286 [2024-06-10 13:51:06.720923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.286 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.545 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.545 "name": "Existed_Raid", 00:22:52.545 "uuid": "1c61258f-d3a2-4f17-ac21-618d1af705a5", 00:22:52.545 "strip_size_kb": 0, 00:22:52.545 "state": "configuring", 00:22:52.545 "raid_level": "raid1", 00:22:52.545 "superblock": true, 00:22:52.545 "num_base_bdevs": 2, 00:22:52.545 "num_base_bdevs_discovered": 1, 00:22:52.545 "num_base_bdevs_operational": 2, 00:22:52.545 "base_bdevs_list": [ 00:22:52.545 { 00:22:52.545 "name": "BaseBdev1", 00:22:52.545 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:52.545 "is_configured": true, 00:22:52.545 "data_offset": 256, 00:22:52.545 "data_size": 7936 00:22:52.545 }, 00:22:52.545 { 00:22:52.545 "name": "BaseBdev2", 00:22:52.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.545 "is_configured": false, 00:22:52.545 "data_offset": 0, 00:22:52.545 "data_size": 0 00:22:52.545 } 00:22:52.545 ] 00:22:52.545 }' 00:22:52.545 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.545 13:51:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:53.116 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:22:53.376 [2024-06-10 13:51:07.687215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:22:53.376 [2024-06-10 13:51:07.687331] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x107de00 00:22:53.376 [2024-06-10 13:51:07.687339] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:53.376 [2024-06-10 13:51:07.687490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107f260 00:22:53.376 [2024-06-10 13:51:07.687585] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x107de00 00:22:53.376 [2024-06-10 13:51:07.687591] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x107de00 00:22:53.376 [2024-06-10 13:51:07.687667] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:53.376 BaseBdev2 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:22:53.376 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:53.637 13:51:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:53.637 [ 00:22:53.637 { 00:22:53.637 "name": "BaseBdev2", 
00:22:53.637 "aliases": [ 00:22:53.637 "955bd8c1-8f50-4eb5-a2cb-850017e92ea9" 00:22:53.637 ], 00:22:53.637 "product_name": "Malloc disk", 00:22:53.637 "block_size": 4096, 00:22:53.637 "num_blocks": 8192, 00:22:53.637 "uuid": "955bd8c1-8f50-4eb5-a2cb-850017e92ea9", 00:22:53.637 "assigned_rate_limits": { 00:22:53.637 "rw_ios_per_sec": 0, 00:22:53.637 "rw_mbytes_per_sec": 0, 00:22:53.637 "r_mbytes_per_sec": 0, 00:22:53.637 "w_mbytes_per_sec": 0 00:22:53.637 }, 00:22:53.637 "claimed": true, 00:22:53.637 "claim_type": "exclusive_write", 00:22:53.637 "zoned": false, 00:22:53.637 "supported_io_types": { 00:22:53.637 "read": true, 00:22:53.637 "write": true, 00:22:53.637 "unmap": true, 00:22:53.637 "write_zeroes": true, 00:22:53.637 "flush": true, 00:22:53.637 "reset": true, 00:22:53.637 "compare": false, 00:22:53.637 "compare_and_write": false, 00:22:53.637 "abort": true, 00:22:53.637 "nvme_admin": false, 00:22:53.637 "nvme_io": false 00:22:53.637 }, 00:22:53.637 "memory_domains": [ 00:22:53.637 { 00:22:53.637 "dma_device_id": "system", 00:22:53.637 "dma_device_type": 1 00:22:53.637 }, 00:22:53.637 { 00:22:53.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:53.637 "dma_device_type": 2 00:22:53.637 } 00:22:53.637 ], 00:22:53.637 "driver_specific": {} 00:22:53.637 } 00:22:53.637 ] 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.637 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:53.897 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.897 "name": "Existed_Raid", 00:22:53.897 "uuid": "1c61258f-d3a2-4f17-ac21-618d1af705a5", 00:22:53.897 "strip_size_kb": 0, 00:22:53.897 "state": "online", 00:22:53.897 "raid_level": "raid1", 00:22:53.897 "superblock": true, 00:22:53.897 "num_base_bdevs": 2, 00:22:53.897 "num_base_bdevs_discovered": 2, 00:22:53.897 "num_base_bdevs_operational": 2, 00:22:53.897 "base_bdevs_list": [ 00:22:53.897 { 00:22:53.897 "name": "BaseBdev1", 00:22:53.897 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:53.897 "is_configured": true, 00:22:53.897 "data_offset": 256, 00:22:53.897 "data_size": 7936 00:22:53.897 }, 00:22:53.897 { 00:22:53.897 "name": "BaseBdev2", 00:22:53.897 "uuid": 
"955bd8c1-8f50-4eb5-a2cb-850017e92ea9", 00:22:53.897 "is_configured": true, 00:22:53.897 "data_offset": 256, 00:22:53.897 "data_size": 7936 00:22:53.897 } 00:22:53.897 ] 00:22:53.897 }' 00:22:53.897 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.897 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:54.467 13:51:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:54.727 [2024-06-10 13:51:09.034835] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:54.727 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:54.727 "name": "Existed_Raid", 00:22:54.727 "aliases": [ 00:22:54.727 "1c61258f-d3a2-4f17-ac21-618d1af705a5" 00:22:54.727 ], 00:22:54.727 "product_name": "Raid Volume", 00:22:54.727 "block_size": 4096, 00:22:54.727 "num_blocks": 7936, 00:22:54.727 "uuid": "1c61258f-d3a2-4f17-ac21-618d1af705a5", 00:22:54.728 "assigned_rate_limits": { 
00:22:54.728 "rw_ios_per_sec": 0, 00:22:54.728 "rw_mbytes_per_sec": 0, 00:22:54.728 "r_mbytes_per_sec": 0, 00:22:54.728 "w_mbytes_per_sec": 0 00:22:54.728 }, 00:22:54.728 "claimed": false, 00:22:54.728 "zoned": false, 00:22:54.728 "supported_io_types": { 00:22:54.728 "read": true, 00:22:54.728 "write": true, 00:22:54.728 "unmap": false, 00:22:54.728 "write_zeroes": true, 00:22:54.728 "flush": false, 00:22:54.728 "reset": true, 00:22:54.728 "compare": false, 00:22:54.728 "compare_and_write": false, 00:22:54.728 "abort": false, 00:22:54.728 "nvme_admin": false, 00:22:54.728 "nvme_io": false 00:22:54.728 }, 00:22:54.728 "memory_domains": [ 00:22:54.728 { 00:22:54.728 "dma_device_id": "system", 00:22:54.728 "dma_device_type": 1 00:22:54.728 }, 00:22:54.728 { 00:22:54.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.728 "dma_device_type": 2 00:22:54.728 }, 00:22:54.728 { 00:22:54.728 "dma_device_id": "system", 00:22:54.728 "dma_device_type": 1 00:22:54.728 }, 00:22:54.728 { 00:22:54.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.728 "dma_device_type": 2 00:22:54.728 } 00:22:54.728 ], 00:22:54.728 "driver_specific": { 00:22:54.728 "raid": { 00:22:54.728 "uuid": "1c61258f-d3a2-4f17-ac21-618d1af705a5", 00:22:54.728 "strip_size_kb": 0, 00:22:54.728 "state": "online", 00:22:54.728 "raid_level": "raid1", 00:22:54.728 "superblock": true, 00:22:54.728 "num_base_bdevs": 2, 00:22:54.728 "num_base_bdevs_discovered": 2, 00:22:54.728 "num_base_bdevs_operational": 2, 00:22:54.728 "base_bdevs_list": [ 00:22:54.728 { 00:22:54.728 "name": "BaseBdev1", 00:22:54.728 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:54.728 "is_configured": true, 00:22:54.728 "data_offset": 256, 00:22:54.728 "data_size": 7936 00:22:54.728 }, 00:22:54.728 { 00:22:54.728 "name": "BaseBdev2", 00:22:54.728 "uuid": "955bd8c1-8f50-4eb5-a2cb-850017e92ea9", 00:22:54.728 "is_configured": true, 00:22:54.728 "data_offset": 256, 00:22:54.728 "data_size": 7936 00:22:54.728 } 00:22:54.728 ] 00:22:54.728 
} 00:22:54.728 } 00:22:54.728 }' 00:22:54.728 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:54.728 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:54.728 BaseBdev2' 00:22:54.728 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:54.728 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:54.728 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:54.988 "name": "BaseBdev1", 00:22:54.988 "aliases": [ 00:22:54.988 "2d6954ba-99ef-4f77-a453-966c55cf23de" 00:22:54.988 ], 00:22:54.988 "product_name": "Malloc disk", 00:22:54.988 "block_size": 4096, 00:22:54.988 "num_blocks": 8192, 00:22:54.988 "uuid": "2d6954ba-99ef-4f77-a453-966c55cf23de", 00:22:54.988 "assigned_rate_limits": { 00:22:54.988 "rw_ios_per_sec": 0, 00:22:54.988 "rw_mbytes_per_sec": 0, 00:22:54.988 "r_mbytes_per_sec": 0, 00:22:54.988 "w_mbytes_per_sec": 0 00:22:54.988 }, 00:22:54.988 "claimed": true, 00:22:54.988 "claim_type": "exclusive_write", 00:22:54.988 "zoned": false, 00:22:54.988 "supported_io_types": { 00:22:54.988 "read": true, 00:22:54.988 "write": true, 00:22:54.988 "unmap": true, 00:22:54.988 "write_zeroes": true, 00:22:54.988 "flush": true, 00:22:54.988 "reset": true, 00:22:54.988 "compare": false, 00:22:54.988 "compare_and_write": false, 00:22:54.988 "abort": true, 00:22:54.988 "nvme_admin": false, 00:22:54.988 "nvme_io": false 00:22:54.988 }, 00:22:54.988 "memory_domains": [ 00:22:54.988 { 00:22:54.988 "dma_device_id": "system", 00:22:54.988 
"dma_device_type": 1 00:22:54.988 }, 00:22:54.988 { 00:22:54.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.988 "dma_device_type": 2 00:22:54.988 } 00:22:54.988 ], 00:22:54.988 "driver_specific": {} 00:22:54.988 }' 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:54.988 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:55.249 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:55.509 13:51:09 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:55.509 "name": "BaseBdev2", 00:22:55.509 "aliases": [ 00:22:55.509 "955bd8c1-8f50-4eb5-a2cb-850017e92ea9" 00:22:55.509 ], 00:22:55.509 "product_name": "Malloc disk", 00:22:55.509 "block_size": 4096, 00:22:55.509 "num_blocks": 8192, 00:22:55.509 "uuid": "955bd8c1-8f50-4eb5-a2cb-850017e92ea9", 00:22:55.509 "assigned_rate_limits": { 00:22:55.509 "rw_ios_per_sec": 0, 00:22:55.509 "rw_mbytes_per_sec": 0, 00:22:55.509 "r_mbytes_per_sec": 0, 00:22:55.509 "w_mbytes_per_sec": 0 00:22:55.509 }, 00:22:55.509 "claimed": true, 00:22:55.509 "claim_type": "exclusive_write", 00:22:55.509 "zoned": false, 00:22:55.509 "supported_io_types": { 00:22:55.509 "read": true, 00:22:55.509 "write": true, 00:22:55.509 "unmap": true, 00:22:55.509 "write_zeroes": true, 00:22:55.509 "flush": true, 00:22:55.509 "reset": true, 00:22:55.509 "compare": false, 00:22:55.509 "compare_and_write": false, 00:22:55.509 "abort": true, 00:22:55.509 "nvme_admin": false, 00:22:55.509 "nvme_io": false 00:22:55.509 }, 00:22:55.509 "memory_domains": [ 00:22:55.509 { 00:22:55.509 "dma_device_id": "system", 00:22:55.509 "dma_device_type": 1 00:22:55.509 }, 00:22:55.509 { 00:22:55.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.509 "dma_device_type": 2 00:22:55.509 } 00:22:55.509 ], 00:22:55.509 "driver_specific": {} 00:22:55.509 }' 00:22:55.509 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:55.509 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:55.509 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:22:55.509 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.509 13:51:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.771 13:51:10 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:55.771 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:56.031 [2024-06-10 13:51:10.309950] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.031 "name": "Existed_Raid", 00:22:56.031 "uuid": "1c61258f-d3a2-4f17-ac21-618d1af705a5", 00:22:56.031 "strip_size_kb": 0, 00:22:56.031 "state": "online", 00:22:56.031 "raid_level": "raid1", 00:22:56.031 "superblock": true, 00:22:56.031 "num_base_bdevs": 2, 00:22:56.031 "num_base_bdevs_discovered": 1, 00:22:56.031 "num_base_bdevs_operational": 1, 00:22:56.031 "base_bdevs_list": [ 00:22:56.031 { 00:22:56.031 "name": null, 00:22:56.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.031 "is_configured": false, 00:22:56.031 "data_offset": 256, 00:22:56.031 "data_size": 7936 00:22:56.031 }, 00:22:56.031 { 00:22:56.031 "name": "BaseBdev2", 00:22:56.031 "uuid": 
"955bd8c1-8f50-4eb5-a2cb-850017e92ea9", 00:22:56.031 "is_configured": true, 00:22:56.031 "data_offset": 256, 00:22:56.031 "data_size": 7936 00:22:56.031 } 00:22:56.031 ] 00:22:56.031 }' 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.031 13:51:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:56.600 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:56.600 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:56.600 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.600 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:56.860 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:56.860 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:56.860 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:57.120 [2024-06-10 13:51:11.376631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:57.120 [2024-06-10 13:51:11.376703] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:57.120 [2024-06-10 13:51:11.382898] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:57.120 [2024-06-10 13:51:11.382926] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:57.120 [2024-06-10 13:51:11.382932] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x107de00 name Existed_Raid, state offline 00:22:57.120 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:57.120 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:57.120 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.120 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1662942 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 1662942 ']' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 1662942 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1662942 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 1662942' 00:22:57.381 killing process with pid 1662942 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # kill 1662942 00:22:57.381 [2024-06-10 13:51:11.654053] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@973 -- # wait 1662942 00:22:57.381 [2024-06-10 13:51:11.654677] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:22:57.381 00:22:57.381 real 0m8.938s 00:22:57.381 user 0m16.195s 00:22:57.381 sys 0m1.406s 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:57.381 13:51:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:57.381 ************************************ 00:22:57.381 END TEST raid_state_function_test_sb_4k 00:22:57.381 ************************************ 00:22:57.381 13:51:11 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:22:57.381 13:51:11 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:22:57.381 13:51:11 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:57.381 13:51:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:57.641 ************************************ 00:22:57.641 START TEST raid_superblock_test_4k 00:22:57.641 ************************************ 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 
00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1665505 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1665505 /var/tmp/spdk-raid.sock 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@830 -- # '[' -z 1665505 ']' 00:22:57.641 13:51:11 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:57.641 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:57.641 13:51:11 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:57.641 [2024-06-10 13:51:11.915744] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:22:57.641 [2024-06-10 13:51:11.915801] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1665505 ] 00:22:57.641 [2024-06-10 13:51:12.008021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:57.641 [2024-06-10 13:51:12.077070] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:22:57.642 [2024-06-10 13:51:12.116356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:57.642 [2024-06-10 13:51:12.116379] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@863 -- # return 0 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 
00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:22:58.582 malloc1 00:22:58.582 13:51:12 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:58.843 [2024-06-10 13:51:13.155621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:58.843 [2024-06-10 13:51:13.155662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:58.843 [2024-06-10 13:51:13.155675] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1886550 00:22:58.843 [2024-06-10 13:51:13.155682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:58.843 [2024-06-10 13:51:13.157063] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:58.843 [2024-06-10 13:51:13.157085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:58.843 pt1 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:58.843 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:22:59.103 malloc2 00:22:59.103 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:59.103 [2024-06-10 13:51:13.566600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:59.103 [2024-06-10 13:51:13.566629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.103 [2024-06-10 13:51:13.566639] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19480f0 00:22:59.103 [2024-06-10 13:51:13.566646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.103 [2024-06-10 13:51:13.567904] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.103 [2024-06-10 13:51:13.567923] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:59.103 pt2 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:22:59.364 [2024-06-10 13:51:13.723005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:59.364 [2024-06-10 13:51:13.724040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:59.364 [2024-06-10 13:51:13.724157] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1952690 00:22:59.364 [2024-06-10 13:51:13.724172] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:59.364 [2024-06-10 13:51:13.724323] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x187e0c0 00:22:59.364 [2024-06-10 13:51:13.724437] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1952690 00:22:59.364 [2024-06-10 13:51:13.724443] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1952690 00:22:59.364 [2024-06-10 13:51:13.724518] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.364 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.625 13:51:13 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.625 "name": "raid_bdev1", 00:22:59.625 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:22:59.625 "strip_size_kb": 0, 00:22:59.625 "state": "online", 00:22:59.625 "raid_level": "raid1", 00:22:59.625 "superblock": true, 00:22:59.625 "num_base_bdevs": 2, 00:22:59.625 "num_base_bdevs_discovered": 2, 00:22:59.625 "num_base_bdevs_operational": 2, 00:22:59.625 "base_bdevs_list": [ 00:22:59.625 { 00:22:59.625 "name": "pt1", 00:22:59.625 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:59.625 "is_configured": true, 00:22:59.625 "data_offset": 256, 00:22:59.625 "data_size": 7936 00:22:59.625 }, 00:22:59.625 { 00:22:59.625 "name": "pt2", 00:22:59.625 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.625 "is_configured": true, 00:22:59.625 "data_offset": 256, 00:22:59.625 "data_size": 7936 00:22:59.625 } 00:22:59.625 ] 00:22:59.625 }' 00:22:59.625 13:51:13 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.625 13:51:13 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:00.196 [2024-06-10 13:51:14.629515] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:00.196 "name": "raid_bdev1", 00:23:00.196 "aliases": [ 00:23:00.196 "ab341653-660a-4ae7-8ce1-08e79e482c2b" 00:23:00.196 ], 00:23:00.196 "product_name": "Raid Volume", 00:23:00.196 "block_size": 4096, 00:23:00.196 "num_blocks": 7936, 00:23:00.196 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:00.196 "assigned_rate_limits": { 00:23:00.196 "rw_ios_per_sec": 0, 00:23:00.196 "rw_mbytes_per_sec": 0, 00:23:00.196 "r_mbytes_per_sec": 0, 00:23:00.196 "w_mbytes_per_sec": 0 00:23:00.196 }, 00:23:00.196 "claimed": false, 00:23:00.196 "zoned": false, 00:23:00.196 "supported_io_types": { 00:23:00.196 "read": true, 00:23:00.196 
"write": true, 00:23:00.196 "unmap": false, 00:23:00.196 "write_zeroes": true, 00:23:00.196 "flush": false, 00:23:00.196 "reset": true, 00:23:00.196 "compare": false, 00:23:00.196 "compare_and_write": false, 00:23:00.196 "abort": false, 00:23:00.196 "nvme_admin": false, 00:23:00.196 "nvme_io": false 00:23:00.196 }, 00:23:00.196 "memory_domains": [ 00:23:00.196 { 00:23:00.196 "dma_device_id": "system", 00:23:00.196 "dma_device_type": 1 00:23:00.196 }, 00:23:00.196 { 00:23:00.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.196 "dma_device_type": 2 00:23:00.196 }, 00:23:00.196 { 00:23:00.196 "dma_device_id": "system", 00:23:00.196 "dma_device_type": 1 00:23:00.196 }, 00:23:00.196 { 00:23:00.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.196 "dma_device_type": 2 00:23:00.196 } 00:23:00.196 ], 00:23:00.196 "driver_specific": { 00:23:00.196 "raid": { 00:23:00.196 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:00.196 "strip_size_kb": 0, 00:23:00.196 "state": "online", 00:23:00.196 "raid_level": "raid1", 00:23:00.196 "superblock": true, 00:23:00.196 "num_base_bdevs": 2, 00:23:00.196 "num_base_bdevs_discovered": 2, 00:23:00.196 "num_base_bdevs_operational": 2, 00:23:00.196 "base_bdevs_list": [ 00:23:00.196 { 00:23:00.196 "name": "pt1", 00:23:00.196 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:00.196 "is_configured": true, 00:23:00.196 "data_offset": 256, 00:23:00.196 "data_size": 7936 00:23:00.196 }, 00:23:00.196 { 00:23:00.196 "name": "pt2", 00:23:00.196 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:00.196 "is_configured": true, 00:23:00.196 "data_offset": 256, 00:23:00.196 "data_size": 7936 00:23:00.196 } 00:23:00.196 ] 00:23:00.196 } 00:23:00.196 } 00:23:00.196 }' 00:23:00.196 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:00.458 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:23:00.458 pt2' 00:23:00.458 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.458 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:00.458 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.458 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.458 "name": "pt1", 00:23:00.458 "aliases": [ 00:23:00.458 "00000000-0000-0000-0000-000000000001" 00:23:00.458 ], 00:23:00.458 "product_name": "passthru", 00:23:00.458 "block_size": 4096, 00:23:00.458 "num_blocks": 8192, 00:23:00.458 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:00.458 "assigned_rate_limits": { 00:23:00.458 "rw_ios_per_sec": 0, 00:23:00.458 "rw_mbytes_per_sec": 0, 00:23:00.458 "r_mbytes_per_sec": 0, 00:23:00.458 "w_mbytes_per_sec": 0 00:23:00.458 }, 00:23:00.458 "claimed": true, 00:23:00.458 "claim_type": "exclusive_write", 00:23:00.458 "zoned": false, 00:23:00.458 "supported_io_types": { 00:23:00.458 "read": true, 00:23:00.458 "write": true, 00:23:00.458 "unmap": true, 00:23:00.458 "write_zeroes": true, 00:23:00.458 "flush": true, 00:23:00.458 "reset": true, 00:23:00.458 "compare": false, 00:23:00.458 "compare_and_write": false, 00:23:00.458 "abort": true, 00:23:00.458 "nvme_admin": false, 00:23:00.458 "nvme_io": false 00:23:00.458 }, 00:23:00.458 "memory_domains": [ 00:23:00.458 { 00:23:00.458 "dma_device_id": "system", 00:23:00.458 "dma_device_type": 1 00:23:00.458 }, 00:23:00.458 { 00:23:00.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.458 "dma_device_type": 2 00:23:00.458 } 00:23:00.458 ], 00:23:00.458 "driver_specific": { 00:23:00.458 "passthru": { 00:23:00.458 "name": "pt1", 00:23:00.458 "base_bdev_name": "malloc1" 00:23:00.458 } 00:23:00.458 } 00:23:00.458 }' 00:23:00.458 
13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.718 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:00.718 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:00.718 13:51:14 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:00.718 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:00.980 "name": "pt2", 00:23:00.980 "aliases": [ 00:23:00.980 "00000000-0000-0000-0000-000000000002" 00:23:00.980 ], 00:23:00.980 "product_name": "passthru", 00:23:00.980 "block_size": 4096, 00:23:00.980 "num_blocks": 8192, 00:23:00.980 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:23:00.980 "assigned_rate_limits": { 00:23:00.980 "rw_ios_per_sec": 0, 00:23:00.980 "rw_mbytes_per_sec": 0, 00:23:00.980 "r_mbytes_per_sec": 0, 00:23:00.980 "w_mbytes_per_sec": 0 00:23:00.980 }, 00:23:00.980 "claimed": true, 00:23:00.980 "claim_type": "exclusive_write", 00:23:00.980 "zoned": false, 00:23:00.980 "supported_io_types": { 00:23:00.980 "read": true, 00:23:00.980 "write": true, 00:23:00.980 "unmap": true, 00:23:00.980 "write_zeroes": true, 00:23:00.980 "flush": true, 00:23:00.980 "reset": true, 00:23:00.980 "compare": false, 00:23:00.980 "compare_and_write": false, 00:23:00.980 "abort": true, 00:23:00.980 "nvme_admin": false, 00:23:00.980 "nvme_io": false 00:23:00.980 }, 00:23:00.980 "memory_domains": [ 00:23:00.980 { 00:23:00.980 "dma_device_id": "system", 00:23:00.980 "dma_device_type": 1 00:23:00.980 }, 00:23:00.980 { 00:23:00.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:00.980 "dma_device_type": 2 00:23:00.980 } 00:23:00.980 ], 00:23:00.980 "driver_specific": { 00:23:00.980 "passthru": { 00:23:00.980 "name": "pt2", 00:23:00.980 "base_bdev_name": "malloc2" 00:23:00.980 } 00:23:00.980 } 00:23:00.980 }' 00:23:00.980 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:01.242 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.502 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:01.502 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:01.502 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:01.502 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:01.762 [2024-06-10 13:51:15.984957] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:01.762 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ab341653-660a-4ae7-8ce1-08e79e482c2b 00:23:01.763 13:51:15 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z ab341653-660a-4ae7-8ce1-08e79e482c2b ']' 00:23:01.763 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:01.763 [2024-06-10 13:51:16.189294] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:01.763 [2024-06-10 13:51:16.189306] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:01.763 [2024-06-10 13:51:16.189350] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:01.763 [2024-06-10 13:51:16.189394] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:01.763 [2024-06-10 13:51:16.189400] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1952690 name raid_bdev1, state offline 00:23:01.763 
13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.763 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:02.023 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:02.024 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:02.024 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.024 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:02.284 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:02.284 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:02.545 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:02.545 13:51:16 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@649 -- # local es=0 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- 
# valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:02.545 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:02.805 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:02.805 [2024-06-10 13:51:17.207827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:02.805 [2024-06-10 13:51:17.208973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:02.805 [2024-06-10 13:51:17.209018] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev 
found on bdev malloc1 00:23:02.805 [2024-06-10 13:51:17.209047] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:02.806 [2024-06-10 13:51:17.209058] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:02.806 [2024-06-10 13:51:17.209063] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1949ec0 name raid_bdev1, state configuring 00:23:02.806 request: 00:23:02.806 { 00:23:02.806 "name": "raid_bdev1", 00:23:02.806 "raid_level": "raid1", 00:23:02.806 "base_bdevs": [ 00:23:02.806 "malloc1", 00:23:02.806 "malloc2" 00:23:02.806 ], 00:23:02.806 "superblock": false, 00:23:02.806 "method": "bdev_raid_create", 00:23:02.806 "req_id": 1 00:23:02.806 } 00:23:02.806 Got JSON-RPC error response 00:23:02.806 response: 00:23:02.806 { 00:23:02.806 "code": -17, 00:23:02.806 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:02.806 } 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # es=1 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.806 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:03.066 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:03.066 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:03.066 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:03.326 [2024-06-10 13:51:17.616839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:03.326 [2024-06-10 13:51:17.616865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.326 [2024-06-10 13:51:17.616876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18897c0 00:23:03.326 [2024-06-10 13:51:17.616882] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.326 [2024-06-10 13:51:17.618288] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.326 [2024-06-10 13:51:17.618309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:03.326 [2024-06-10 13:51:17.618360] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:03.326 [2024-06-10 13:51:17.618383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:03.326 pt1 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.326 13:51:17 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.326 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.585 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:03.585 "name": "raid_bdev1", 00:23:03.585 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:03.585 "strip_size_kb": 0, 00:23:03.585 "state": "configuring", 00:23:03.586 "raid_level": "raid1", 00:23:03.586 "superblock": true, 00:23:03.586 "num_base_bdevs": 2, 00:23:03.586 "num_base_bdevs_discovered": 1, 00:23:03.586 "num_base_bdevs_operational": 2, 00:23:03.586 "base_bdevs_list": [ 00:23:03.586 { 00:23:03.586 "name": "pt1", 00:23:03.586 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:03.586 "is_configured": true, 00:23:03.586 "data_offset": 256, 00:23:03.586 "data_size": 7936 00:23:03.586 }, 00:23:03.586 { 00:23:03.586 "name": null, 00:23:03.586 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:03.586 "is_configured": false, 00:23:03.586 "data_offset": 256, 00:23:03.586 "data_size": 7936 00:23:03.586 } 00:23:03.586 ] 00:23:03.586 }' 00:23:03.586 13:51:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.586 13:51:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 
00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:04.157 [2024-06-10 13:51:18.555229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:04.157 [2024-06-10 13:51:18.555260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.157 [2024-06-10 13:51:18.555271] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1949a40 00:23:04.157 [2024-06-10 13:51:18.555277] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.157 [2024-06-10 13:51:18.555574] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.157 [2024-06-10 13:51:18.555586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:04.157 [2024-06-10 13:51:18.555628] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:04.157 [2024-06-10 13:51:18.555640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:04.157 [2024-06-10 13:51:18.555717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1888f30 00:23:04.157 [2024-06-10 13:51:18.555724] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:04.157 [2024-06-10 13:51:18.555864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1887b00 00:23:04.157 [2024-06-10 13:51:18.555975] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1888f30 00:23:04.157 [2024-06-10 13:51:18.555981] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1888f30 00:23:04.157 [2024-06-10 
13:51:18.556059] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.157 pt2 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.157 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.417 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.417 "name": "raid_bdev1", 00:23:04.417 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:04.417 
"strip_size_kb": 0, 00:23:04.417 "state": "online", 00:23:04.417 "raid_level": "raid1", 00:23:04.417 "superblock": true, 00:23:04.417 "num_base_bdevs": 2, 00:23:04.417 "num_base_bdevs_discovered": 2, 00:23:04.417 "num_base_bdevs_operational": 2, 00:23:04.417 "base_bdevs_list": [ 00:23:04.417 { 00:23:04.417 "name": "pt1", 00:23:04.417 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:04.417 "is_configured": true, 00:23:04.417 "data_offset": 256, 00:23:04.417 "data_size": 7936 00:23:04.417 }, 00:23:04.417 { 00:23:04.417 "name": "pt2", 00:23:04.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:04.417 "is_configured": true, 00:23:04.417 "data_offset": 256, 00:23:04.417 "data_size": 7936 00:23:04.417 } 00:23:04.417 ] 00:23:04.417 }' 00:23:04.417 13:51:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.417 13:51:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:04.987 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:05.247 [2024-06-10 13:51:19.513833] bdev_raid.c:1107:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:05.247 "name": "raid_bdev1", 00:23:05.247 "aliases": [ 00:23:05.247 "ab341653-660a-4ae7-8ce1-08e79e482c2b" 00:23:05.247 ], 00:23:05.247 "product_name": "Raid Volume", 00:23:05.247 "block_size": 4096, 00:23:05.247 "num_blocks": 7936, 00:23:05.247 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:05.247 "assigned_rate_limits": { 00:23:05.247 "rw_ios_per_sec": 0, 00:23:05.247 "rw_mbytes_per_sec": 0, 00:23:05.247 "r_mbytes_per_sec": 0, 00:23:05.247 "w_mbytes_per_sec": 0 00:23:05.247 }, 00:23:05.247 "claimed": false, 00:23:05.247 "zoned": false, 00:23:05.247 "supported_io_types": { 00:23:05.247 "read": true, 00:23:05.247 "write": true, 00:23:05.247 "unmap": false, 00:23:05.247 "write_zeroes": true, 00:23:05.247 "flush": false, 00:23:05.247 "reset": true, 00:23:05.247 "compare": false, 00:23:05.247 "compare_and_write": false, 00:23:05.247 "abort": false, 00:23:05.247 "nvme_admin": false, 00:23:05.247 "nvme_io": false 00:23:05.247 }, 00:23:05.247 "memory_domains": [ 00:23:05.247 { 00:23:05.247 "dma_device_id": "system", 00:23:05.247 "dma_device_type": 1 00:23:05.247 }, 00:23:05.247 { 00:23:05.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.247 "dma_device_type": 2 00:23:05.247 }, 00:23:05.247 { 00:23:05.247 "dma_device_id": "system", 00:23:05.247 "dma_device_type": 1 00:23:05.247 }, 00:23:05.247 { 00:23:05.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.247 "dma_device_type": 2 00:23:05.247 } 00:23:05.247 ], 00:23:05.247 "driver_specific": { 00:23:05.247 "raid": { 00:23:05.247 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:05.247 "strip_size_kb": 0, 00:23:05.247 "state": "online", 00:23:05.247 "raid_level": "raid1", 00:23:05.247 "superblock": true, 00:23:05.247 "num_base_bdevs": 2, 00:23:05.247 "num_base_bdevs_discovered": 2, 00:23:05.247 "num_base_bdevs_operational": 2, 00:23:05.247 "base_bdevs_list": [ 
00:23:05.247 { 00:23:05.247 "name": "pt1", 00:23:05.247 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:05.247 "is_configured": true, 00:23:05.247 "data_offset": 256, 00:23:05.247 "data_size": 7936 00:23:05.247 }, 00:23:05.247 { 00:23:05.247 "name": "pt2", 00:23:05.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:05.247 "is_configured": true, 00:23:05.247 "data_offset": 256, 00:23:05.247 "data_size": 7936 00:23:05.247 } 00:23:05.247 ] 00:23:05.247 } 00:23:05.247 } 00:23:05.247 }' 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:05.247 pt2' 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:05.247 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.508 "name": "pt1", 00:23:05.508 "aliases": [ 00:23:05.508 "00000000-0000-0000-0000-000000000001" 00:23:05.508 ], 00:23:05.508 "product_name": "passthru", 00:23:05.508 "block_size": 4096, 00:23:05.508 "num_blocks": 8192, 00:23:05.508 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:05.508 "assigned_rate_limits": { 00:23:05.508 "rw_ios_per_sec": 0, 00:23:05.508 "rw_mbytes_per_sec": 0, 00:23:05.508 "r_mbytes_per_sec": 0, 00:23:05.508 "w_mbytes_per_sec": 0 00:23:05.508 }, 00:23:05.508 "claimed": true, 00:23:05.508 "claim_type": "exclusive_write", 00:23:05.508 "zoned": false, 00:23:05.508 "supported_io_types": { 00:23:05.508 "read": true, 00:23:05.508 "write": true, 
00:23:05.508 "unmap": true, 00:23:05.508 "write_zeroes": true, 00:23:05.508 "flush": true, 00:23:05.508 "reset": true, 00:23:05.508 "compare": false, 00:23:05.508 "compare_and_write": false, 00:23:05.508 "abort": true, 00:23:05.508 "nvme_admin": false, 00:23:05.508 "nvme_io": false 00:23:05.508 }, 00:23:05.508 "memory_domains": [ 00:23:05.508 { 00:23:05.508 "dma_device_id": "system", 00:23:05.508 "dma_device_type": 1 00:23:05.508 }, 00:23:05.508 { 00:23:05.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.508 "dma_device_type": 2 00:23:05.508 } 00:23:05.508 ], 00:23:05.508 "driver_specific": { 00:23:05.508 "passthru": { 00:23:05.508 "name": "pt1", 00:23:05.508 "base_bdev_name": "malloc1" 00:23:05.508 } 00:23:05.508 } 00:23:05.508 }' 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.508 13:51:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:05.769 13:51:20 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:05.769 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:06.029 "name": "pt2", 00:23:06.029 "aliases": [ 00:23:06.029 "00000000-0000-0000-0000-000000000002" 00:23:06.029 ], 00:23:06.029 "product_name": "passthru", 00:23:06.029 "block_size": 4096, 00:23:06.029 "num_blocks": 8192, 00:23:06.029 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.029 "assigned_rate_limits": { 00:23:06.029 "rw_ios_per_sec": 0, 00:23:06.029 "rw_mbytes_per_sec": 0, 00:23:06.029 "r_mbytes_per_sec": 0, 00:23:06.029 "w_mbytes_per_sec": 0 00:23:06.029 }, 00:23:06.029 "claimed": true, 00:23:06.029 "claim_type": "exclusive_write", 00:23:06.029 "zoned": false, 00:23:06.029 "supported_io_types": { 00:23:06.029 "read": true, 00:23:06.029 "write": true, 00:23:06.029 "unmap": true, 00:23:06.029 "write_zeroes": true, 00:23:06.029 "flush": true, 00:23:06.029 "reset": true, 00:23:06.029 "compare": false, 00:23:06.029 "compare_and_write": false, 00:23:06.029 "abort": true, 00:23:06.029 "nvme_admin": false, 00:23:06.029 "nvme_io": false 00:23:06.029 }, 00:23:06.029 "memory_domains": [ 00:23:06.029 { 00:23:06.029 "dma_device_id": "system", 00:23:06.029 "dma_device_type": 1 00:23:06.029 }, 00:23:06.029 { 00:23:06.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.029 "dma_device_type": 2 00:23:06.029 } 00:23:06.029 ], 00:23:06.029 "driver_specific": { 00:23:06.029 "passthru": { 00:23:06.029 "name": "pt2", 00:23:06.029 "base_bdev_name": "malloc2" 00:23:06.029 } 00:23:06.029 } 00:23:06.029 }' 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:06.029 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:06.289 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:06.549 [2024-06-10 13:51:20.833182] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:06.549 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' ab341653-660a-4ae7-8ce1-08e79e482c2b '!=' ab341653-660a-4ae7-8ce1-08e79e482c2b ']' 00:23:06.549 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:06.549 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:06.549 13:51:20 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:23:06.549 13:51:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:06.809 [2024-06-10 13:51:21.033524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.809 "name": "raid_bdev1", 00:23:06.809 "uuid": 
"ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:06.809 "strip_size_kb": 0, 00:23:06.809 "state": "online", 00:23:06.809 "raid_level": "raid1", 00:23:06.809 "superblock": true, 00:23:06.809 "num_base_bdevs": 2, 00:23:06.809 "num_base_bdevs_discovered": 1, 00:23:06.809 "num_base_bdevs_operational": 1, 00:23:06.809 "base_bdevs_list": [ 00:23:06.809 { 00:23:06.809 "name": null, 00:23:06.809 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.809 "is_configured": false, 00:23:06.809 "data_offset": 256, 00:23:06.809 "data_size": 7936 00:23:06.809 }, 00:23:06.809 { 00:23:06.809 "name": "pt2", 00:23:06.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.809 "is_configured": true, 00:23:06.809 "data_offset": 256, 00:23:06.809 "data_size": 7936 00:23:06.809 } 00:23:06.809 ] 00:23:06.809 }' 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.809 13:51:21 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:07.404 13:51:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:07.682 [2024-06-10 13:51:22.007965] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:07.682 [2024-06-10 13:51:22.007983] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.682 [2024-06-10 13:51:22.008020] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:07.682 [2024-06-10 13:51:22.008051] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:07.682 [2024-06-10 13:51:22.008058] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1888f30 name raid_bdev1, state offline 00:23:07.682 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.682 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:07.941 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:07.941 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:07.941 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:07.941 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:07.941 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:08.202 [2024-06-10 13:51:22.613466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:08.202 [2024-06-10 13:51:22.613497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.202 [2024-06-10 13:51:22.613511] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x187ff20 00:23:08.202 [2024-06-10 13:51:22.613519] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.202 [2024-06-10 13:51:22.614939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.202 [2024-06-10 13:51:22.614962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:08.202 [2024-06-10 13:51:22.615013] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:08.202 [2024-06-10 13:51:22.615033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:08.202 [2024-06-10 13:51:22.615097] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x187e770 00:23:08.202 [2024-06-10 13:51:22.615103] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:08.202 [2024-06-10 13:51:22.615263] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1886220 00:23:08.202 [2024-06-10 13:51:22.615366] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x187e770 00:23:08.202 [2024-06-10 13:51:22.615372] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x187e770 00:23:08.202 [2024-06-10 13:51:22.615448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.202 pt2 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=1 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.202 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.462 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.462 "name": "raid_bdev1", 00:23:08.462 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:08.462 "strip_size_kb": 0, 00:23:08.462 "state": "online", 00:23:08.462 "raid_level": "raid1", 00:23:08.462 "superblock": true, 00:23:08.462 "num_base_bdevs": 2, 00:23:08.462 "num_base_bdevs_discovered": 1, 00:23:08.462 "num_base_bdevs_operational": 1, 00:23:08.462 "base_bdevs_list": [ 00:23:08.462 { 00:23:08.462 "name": null, 00:23:08.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.462 "is_configured": false, 00:23:08.462 "data_offset": 256, 00:23:08.462 "data_size": 7936 00:23:08.462 }, 00:23:08.462 { 00:23:08.462 "name": "pt2", 00:23:08.462 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:08.462 "is_configured": true, 00:23:08.462 "data_offset": 256, 00:23:08.462 "data_size": 7936 00:23:08.462 } 00:23:08.462 ] 00:23:08.462 }' 00:23:08.462 13:51:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.462 13:51:22 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:09.031 13:51:23 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:09.291 [2024-06-10 13:51:23.592019] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:09.291 [2024-06-10 13:51:23.592035] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:09.291 [2024-06-10 13:51:23.592074] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:09.291 [2024-06-10 13:51:23.592106] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:09.291 [2024-06-10 13:51:23.592112] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187e770 name raid_bdev1, state offline 00:23:09.291 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.291 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:09.550 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:09.550 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:09.550 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:23:09.550 13:51:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:09.550 [2024-06-10 13:51:23.997036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:09.550 [2024-06-10 13:51:23.997070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.550 [2024-06-10 13:51:23.997081] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1886a10 00:23:09.551 [2024-06-10 13:51:23.997088] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.551 [2024-06-10 13:51:23.998503] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.551 [2024-06-10 13:51:23.998524] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:09.551 [2024-06-10 13:51:23.998578] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:09.551 [2024-06-10 13:51:23.998597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:09.551 [2024-06-10 13:51:23.998676] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:09.551 [2024-06-10 13:51:23.998684] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:09.551 [2024-06-10 13:51:23.998692] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187cea0 name raid_bdev1, state configuring 00:23:09.551 [2024-06-10 13:51:23.998707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:09.551 [2024-06-10 13:51:23.998751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x187cea0 00:23:09.551 [2024-06-10 13:51:23.998757] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:09.551 [2024-06-10 13:51:23.998907] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1887b00 00:23:09.551 [2024-06-10 13:51:23.999005] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x187cea0 00:23:09.551 [2024-06-10 13:51:23.999011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x187cea0 00:23:09.551 [2024-06-10 13:51:23.999093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:23:09.551 pt1 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.551 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.810 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.810 "name": "raid_bdev1", 00:23:09.810 "uuid": "ab341653-660a-4ae7-8ce1-08e79e482c2b", 00:23:09.810 "strip_size_kb": 0, 00:23:09.810 "state": "online", 00:23:09.810 "raid_level": "raid1", 00:23:09.810 "superblock": true, 00:23:09.810 "num_base_bdevs": 2, 00:23:09.810 "num_base_bdevs_discovered": 1, 
00:23:09.810 "num_base_bdevs_operational": 1, 00:23:09.810 "base_bdevs_list": [ 00:23:09.810 { 00:23:09.810 "name": null, 00:23:09.810 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.810 "is_configured": false, 00:23:09.810 "data_offset": 256, 00:23:09.810 "data_size": 7936 00:23:09.810 }, 00:23:09.810 { 00:23:09.810 "name": "pt2", 00:23:09.810 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:09.810 "is_configured": true, 00:23:09.810 "data_offset": 256, 00:23:09.810 "data_size": 7936 00:23:09.810 } 00:23:09.810 ] 00:23:09.810 }' 00:23:09.810 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.810 13:51:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:10.379 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:10.379 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:10.639 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:10.639 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:10.639 13:51:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:10.899 [2024-06-10 13:51:25.164156] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' ab341653-660a-4ae7-8ce1-08e79e482c2b '!=' ab341653-660a-4ae7-8ce1-08e79e482c2b ']' 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1665505 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@949 -- 
# '[' -z 1665505 ']' 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # kill -0 1665505 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # uname 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1665505 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1665505' 00:23:10.899 killing process with pid 1665505 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # kill 1665505 00:23:10.899 [2024-06-10 13:51:25.234742] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:10.899 [2024-06-10 13:51:25.234783] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:10.899 [2024-06-10 13:51:25.234817] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:10.899 [2024-06-10 13:51:25.234823] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187cea0 name raid_bdev1, state offline 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@973 -- # wait 1665505 00:23:10.899 [2024-06-10 13:51:25.244487] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:23:10.899 00:23:10.899 real 0m13.511s 00:23:10.899 user 0m25.006s 00:23:10.899 sys 0m2.013s 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:23:10.899 13:51:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:23:10.899 ************************************ 00:23:10.899 END TEST raid_superblock_test_4k 00:23:10.899 ************************************ 00:23:11.160 13:51:25 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:23:11.160 13:51:25 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:23:11.160 13:51:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:23:11.160 13:51:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:11.160 13:51:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:11.160 ************************************ 00:23:11.160 START TEST raid_rebuild_test_sb_4k 00:23:11.160 ************************************ 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 
00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1668416 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1668416 /var/tmp/spdk-raid.sock 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 1668416 ']' 00:23:11.160 13:51:25 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:11.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:11.160 13:51:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:11.160 [2024-06-10 13:51:25.514196] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:23:11.160 [2024-06-10 13:51:25.514243] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1668416 ] 00:23:11.160 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:11.160 Zero copy mechanism will not be used. 
00:23:11.160 [2024-06-10 13:51:25.602514] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:11.420 [2024-06-10 13:51:25.667498] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:23:11.420 [2024-06-10 13:51:25.707209] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:11.420 [2024-06-10 13:51:25.707233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:11.989 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:11.989 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:23:11.989 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:11.989 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:23:12.249 BaseBdev1_malloc 00:23:12.249 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:12.509 [2024-06-10 13:51:26.754237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:12.509 [2024-06-10 13:51:26.754270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.509 [2024-06-10 13:51:26.754287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ef900 00:23:12.509 [2024-06-10 13:51:26.754294] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.509 [2024-06-10 13:51:26.755735] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.509 [2024-06-10 13:51:26.755756] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:12.509 
BaseBdev1 00:23:12.509 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:12.509 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:23:12.509 BaseBdev2_malloc 00:23:12.509 13:51:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:12.770 [2024-06-10 13:51:27.157493] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:12.770 [2024-06-10 13:51:27.157523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:12.770 [2024-06-10 13:51:27.157536] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15f09c0 00:23:12.770 [2024-06-10 13:51:27.157543] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:12.770 [2024-06-10 13:51:27.158815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:12.770 [2024-06-10 13:51:27.158833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:12.770 BaseBdev2 00:23:12.770 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:23:13.030 spare_malloc 00:23:13.030 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:13.290 spare_delay 00:23:13.290 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:13.290 [2024-06-10 13:51:27.761049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:13.290 [2024-06-10 13:51:27.761077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.290 [2024-06-10 13:51:27.761088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179e6b0 00:23:13.290 [2024-06-10 13:51:27.761095] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.290 [2024-06-10 13:51:27.762366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.290 [2024-06-10 13:51:27.762385] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:13.549 spare 00:23:13.549 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:13.549 [2024-06-10 13:51:27.961579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:13.550 [2024-06-10 13:51:27.962644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:13.550 [2024-06-10 13:51:27.962770] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x179fd20 00:23:13.550 [2024-06-10 13:51:27.962779] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:13.550 [2024-06-10 13:51:27.962940] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ef5d0 00:23:13.550 [2024-06-10 13:51:27.963053] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x179fd20 00:23:13.550 [2024-06-10 13:51:27.963059] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x179fd20 00:23:13.550 [2024-06-10 13:51:27.963134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.550 13:51:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.810 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.810 "name": "raid_bdev1", 00:23:13.810 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:13.810 "strip_size_kb": 0, 00:23:13.810 "state": "online", 00:23:13.810 "raid_level": "raid1", 00:23:13.810 "superblock": true, 00:23:13.810 "num_base_bdevs": 2, 00:23:13.810 
"num_base_bdevs_discovered": 2, 00:23:13.810 "num_base_bdevs_operational": 2, 00:23:13.810 "base_bdevs_list": [ 00:23:13.810 { 00:23:13.810 "name": "BaseBdev1", 00:23:13.810 "uuid": "83529e63-135b-5bc9-a9be-e05bb885f5bf", 00:23:13.810 "is_configured": true, 00:23:13.810 "data_offset": 256, 00:23:13.810 "data_size": 7936 00:23:13.810 }, 00:23:13.810 { 00:23:13.810 "name": "BaseBdev2", 00:23:13.810 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:13.810 "is_configured": true, 00:23:13.810 "data_offset": 256, 00:23:13.810 "data_size": 7936 00:23:13.810 } 00:23:13.810 ] 00:23:13.810 }' 00:23:13.810 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.810 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:14.379 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:14.379 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:14.639 [2024-06-10 13:51:28.936234] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:14.639 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:23:14.639 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.639 13:51:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:14.906 
13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:14.906 [2024-06-10 13:51:29.349116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ef5d0 00:23:14.906 /dev/nbd0 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:14.906 13:51:29 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:14.906 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:15.171 1+0 records in 00:23:15.171 1+0 records out 00:23:15.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267697 s, 15.3 MB/s 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:15.171 13:51:29 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:23:15.739 7936+0 records in 00:23:15.739 7936+0 records out 00:23:15.739 32505856 bytes (33 MB, 31 MiB) copied, 0.666268 s, 48.8 MB/s 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:23:15.739 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:15.740 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:15.999 [2024-06-10 13:51:30.290777] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:23:15.999 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:16.259 [2024-06-10 13:51:30.479285] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.259 "name": "raid_bdev1", 00:23:16.259 "uuid": 
"f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:16.259 "strip_size_kb": 0, 00:23:16.259 "state": "online", 00:23:16.259 "raid_level": "raid1", 00:23:16.259 "superblock": true, 00:23:16.259 "num_base_bdevs": 2, 00:23:16.259 "num_base_bdevs_discovered": 1, 00:23:16.259 "num_base_bdevs_operational": 1, 00:23:16.259 "base_bdevs_list": [ 00:23:16.259 { 00:23:16.259 "name": null, 00:23:16.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.259 "is_configured": false, 00:23:16.259 "data_offset": 256, 00:23:16.259 "data_size": 7936 00:23:16.259 }, 00:23:16.259 { 00:23:16.259 "name": "BaseBdev2", 00:23:16.259 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:16.259 "is_configured": true, 00:23:16.259 "data_offset": 256, 00:23:16.259 "data_size": 7936 00:23:16.259 } 00:23:16.259 ] 00:23:16.259 }' 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.259 13:51:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:16.828 13:51:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:17.088 [2024-06-10 13:51:31.413657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:17.088 [2024-06-10 13:51:31.417136] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179dec0 00:23:17.088 [2024-06-10 13:51:31.418790] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:17.088 13:51:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.028 13:51:32 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.028 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:18.290 "name": "raid_bdev1", 00:23:18.290 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:18.290 "strip_size_kb": 0, 00:23:18.290 "state": "online", 00:23:18.290 "raid_level": "raid1", 00:23:18.290 "superblock": true, 00:23:18.290 "num_base_bdevs": 2, 00:23:18.290 "num_base_bdevs_discovered": 2, 00:23:18.290 "num_base_bdevs_operational": 2, 00:23:18.290 "process": { 00:23:18.290 "type": "rebuild", 00:23:18.290 "target": "spare", 00:23:18.290 "progress": { 00:23:18.290 "blocks": 3072, 00:23:18.290 "percent": 38 00:23:18.290 } 00:23:18.290 }, 00:23:18.290 "base_bdevs_list": [ 00:23:18.290 { 00:23:18.290 "name": "spare", 00:23:18.290 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:18.290 "is_configured": true, 00:23:18.290 "data_offset": 256, 00:23:18.290 "data_size": 7936 00:23:18.290 }, 00:23:18.290 { 00:23:18.290 "name": "BaseBdev2", 00:23:18.290 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:18.290 "is_configured": true, 00:23:18.290 "data_offset": 256, 00:23:18.290 "data_size": 7936 00:23:18.290 } 00:23:18.290 ] 00:23:18.290 }' 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:18.290 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:18.551 [2024-06-10 13:51:32.919276] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:18.551 [2024-06-10 13:51:32.928073] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:18.551 [2024-06-10 13:51:32.928104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.551 [2024-06-10 13:51:32.928114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:18.551 [2024-06-10 13:51:32.928118] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.551 13:51:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.812 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.812 "name": "raid_bdev1", 00:23:18.812 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:18.812 "strip_size_kb": 0, 00:23:18.812 "state": "online", 00:23:18.812 "raid_level": "raid1", 00:23:18.812 "superblock": true, 00:23:18.812 "num_base_bdevs": 2, 00:23:18.812 "num_base_bdevs_discovered": 1, 00:23:18.812 "num_base_bdevs_operational": 1, 00:23:18.812 "base_bdevs_list": [ 00:23:18.812 { 00:23:18.812 "name": null, 00:23:18.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.812 "is_configured": false, 00:23:18.812 "data_offset": 256, 00:23:18.812 "data_size": 7936 00:23:18.812 }, 00:23:18.812 { 00:23:18.812 "name": "BaseBdev2", 00:23:18.812 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:18.812 "is_configured": true, 00:23:18.812 "data_offset": 256, 00:23:18.812 "data_size": 7936 00:23:18.812 } 00:23:18.812 ] 00:23:18.812 }' 00:23:18.812 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.812 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.382 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:19.643 "name": "raid_bdev1", 00:23:19.643 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:19.643 "strip_size_kb": 0, 00:23:19.643 "state": "online", 00:23:19.643 "raid_level": "raid1", 00:23:19.643 "superblock": true, 00:23:19.643 "num_base_bdevs": 2, 00:23:19.643 "num_base_bdevs_discovered": 1, 00:23:19.643 "num_base_bdevs_operational": 1, 00:23:19.643 "base_bdevs_list": [ 00:23:19.643 { 00:23:19.643 "name": null, 00:23:19.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.643 "is_configured": false, 00:23:19.643 "data_offset": 256, 00:23:19.643 "data_size": 7936 00:23:19.643 }, 00:23:19.643 { 00:23:19.643 "name": "BaseBdev2", 00:23:19.643 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:19.643 "is_configured": true, 00:23:19.643 "data_offset": 256, 00:23:19.643 "data_size": 7936 00:23:19.643 } 00:23:19.643 ] 00:23:19.643 }' 00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:19.643 13:51:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:19.903 [2024-06-10 13:51:34.186571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:19.903 [2024-06-10 13:51:34.190038] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a1200 00:23:19.903 [2024-06-10 13:51:34.191268] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:19.903 13:51:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.843 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.105 "name": "raid_bdev1", 00:23:21.105 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:21.105 "strip_size_kb": 0, 00:23:21.105 "state": "online", 00:23:21.105 
"raid_level": "raid1", 00:23:21.105 "superblock": true, 00:23:21.105 "num_base_bdevs": 2, 00:23:21.105 "num_base_bdevs_discovered": 2, 00:23:21.105 "num_base_bdevs_operational": 2, 00:23:21.105 "process": { 00:23:21.105 "type": "rebuild", 00:23:21.105 "target": "spare", 00:23:21.105 "progress": { 00:23:21.105 "blocks": 2816, 00:23:21.105 "percent": 35 00:23:21.105 } 00:23:21.105 }, 00:23:21.105 "base_bdevs_list": [ 00:23:21.105 { 00:23:21.105 "name": "spare", 00:23:21.105 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:21.105 "is_configured": true, 00:23:21.105 "data_offset": 256, 00:23:21.105 "data_size": 7936 00:23:21.105 }, 00:23:21.105 { 00:23:21.105 "name": "BaseBdev2", 00:23:21.105 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:21.105 "is_configured": true, 00:23:21.105 "data_offset": 256, 00:23:21.105 "data_size": 7936 00:23:21.105 } 00:23:21.105 ] 00:23:21.105 }' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:21.105 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=883 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.105 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.366 "name": "raid_bdev1", 00:23:21.366 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:21.366 "strip_size_kb": 0, 00:23:21.366 "state": "online", 00:23:21.366 "raid_level": "raid1", 00:23:21.366 "superblock": true, 00:23:21.366 "num_base_bdevs": 2, 00:23:21.366 "num_base_bdevs_discovered": 2, 00:23:21.366 "num_base_bdevs_operational": 2, 00:23:21.366 "process": { 00:23:21.366 "type": "rebuild", 00:23:21.366 "target": "spare", 00:23:21.366 "progress": { 00:23:21.366 "blocks": 3584, 00:23:21.366 "percent": 45 00:23:21.366 } 00:23:21.366 }, 00:23:21.366 "base_bdevs_list": [ 00:23:21.366 { 00:23:21.366 "name": "spare", 00:23:21.366 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:21.366 "is_configured": 
true, 00:23:21.366 "data_offset": 256, 00:23:21.366 "data_size": 7936 00:23:21.366 }, 00:23:21.366 { 00:23:21.366 "name": "BaseBdev2", 00:23:21.366 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:21.366 "is_configured": true, 00:23:21.366 "data_offset": 256, 00:23:21.366 "data_size": 7936 00:23:21.366 } 00:23:21.366 ] 00:23:21.366 }' 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:21.366 13:51:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:22.309 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:22.569 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.569 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.569 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:22.569 "name": "raid_bdev1", 00:23:22.569 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:22.569 "strip_size_kb": 0, 00:23:22.569 "state": "online", 00:23:22.569 "raid_level": "raid1", 00:23:22.569 "superblock": true, 00:23:22.569 "num_base_bdevs": 2, 00:23:22.569 "num_base_bdevs_discovered": 2, 00:23:22.569 "num_base_bdevs_operational": 2, 00:23:22.569 "process": { 00:23:22.569 "type": "rebuild", 00:23:22.569 "target": "spare", 00:23:22.569 "progress": { 00:23:22.569 "blocks": 6912, 00:23:22.569 "percent": 87 00:23:22.569 } 00:23:22.569 }, 00:23:22.569 "base_bdevs_list": [ 00:23:22.569 { 00:23:22.569 "name": "spare", 00:23:22.569 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:22.569 "is_configured": true, 00:23:22.569 "data_offset": 256, 00:23:22.569 "data_size": 7936 00:23:22.569 }, 00:23:22.569 { 00:23:22.569 "name": "BaseBdev2", 00:23:22.569 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:22.569 "is_configured": true, 00:23:22.569 "data_offset": 256, 00:23:22.569 "data_size": 7936 00:23:22.569 } 00:23:22.569 ] 00:23:22.569 }' 00:23:22.569 13:51:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:22.569 13:51:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:22.569 13:51:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:22.829 13:51:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:22.830 13:51:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:23.090 [2024-06-10 13:51:37.309613] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:23.090 [2024-06-10 13:51:37.309658] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:23.090 [2024-06-10 13:51:37.309722] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.665 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:23.926 "name": "raid_bdev1", 00:23:23.926 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:23.926 "strip_size_kb": 0, 00:23:23.926 "state": "online", 00:23:23.926 "raid_level": "raid1", 00:23:23.926 "superblock": true, 00:23:23.926 "num_base_bdevs": 2, 00:23:23.926 "num_base_bdevs_discovered": 2, 00:23:23.926 "num_base_bdevs_operational": 2, 00:23:23.926 "base_bdevs_list": [ 00:23:23.926 { 00:23:23.926 "name": "spare", 00:23:23.926 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:23.926 "is_configured": true, 00:23:23.926 "data_offset": 256, 00:23:23.926 "data_size": 7936 00:23:23.926 }, 00:23:23.926 { 00:23:23.926 "name": "BaseBdev2", 00:23:23.926 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:23.926 "is_configured": true, 00:23:23.926 "data_offset": 256, 00:23:23.926 
"data_size": 7936 00:23:23.926 } 00:23:23.926 ] 00:23:23.926 }' 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.926 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.187 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:24.187 "name": "raid_bdev1", 00:23:24.187 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:24.187 "strip_size_kb": 0, 00:23:24.187 "state": "online", 00:23:24.187 "raid_level": "raid1", 00:23:24.187 "superblock": true, 00:23:24.187 "num_base_bdevs": 2, 00:23:24.187 "num_base_bdevs_discovered": 2, 00:23:24.187 "num_base_bdevs_operational": 2, 00:23:24.187 
"base_bdevs_list": [ 00:23:24.187 { 00:23:24.187 "name": "spare", 00:23:24.187 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:24.187 "is_configured": true, 00:23:24.187 "data_offset": 256, 00:23:24.187 "data_size": 7936 00:23:24.187 }, 00:23:24.187 { 00:23:24.187 "name": "BaseBdev2", 00:23:24.187 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:24.187 "is_configured": true, 00:23:24.187 "data_offset": 256, 00:23:24.187 "data_size": 7936 00:23:24.187 } 00:23:24.187 ] 00:23:24.187 }' 00:23:24.187 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:24.187 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:24.187 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.449 "name": "raid_bdev1", 00:23:24.449 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:24.449 "strip_size_kb": 0, 00:23:24.449 "state": "online", 00:23:24.449 "raid_level": "raid1", 00:23:24.449 "superblock": true, 00:23:24.449 "num_base_bdevs": 2, 00:23:24.449 "num_base_bdevs_discovered": 2, 00:23:24.449 "num_base_bdevs_operational": 2, 00:23:24.449 "base_bdevs_list": [ 00:23:24.449 { 00:23:24.449 "name": "spare", 00:23:24.449 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:24.449 "is_configured": true, 00:23:24.449 "data_offset": 256, 00:23:24.449 "data_size": 7936 00:23:24.449 }, 00:23:24.449 { 00:23:24.449 "name": "BaseBdev2", 00:23:24.449 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:24.449 "is_configured": true, 00:23:24.449 "data_offset": 256, 00:23:24.449 "data_size": 7936 00:23:24.449 } 00:23:24.449 ] 00:23:24.449 }' 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.449 13:51:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:25.020 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:25.280 [2024-06-10 13:51:39.622629] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:25.280 [2024-06-10 13:51:39.622647] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:23:25.280 [2024-06-10 13:51:39.622695] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:25.280 [2024-06-10 13:51:39.622742] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:25.280 [2024-06-10 13:51:39.622749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179fd20 name raid_bdev1, state offline 00:23:25.280 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.280 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:25.540 13:51:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:25.800 /dev/nbd0 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.800 1+0 records in 00:23:25.800 1+0 records out 00:23:25.800 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024524 s, 16.7 MB/s 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@885 -- # size=4096 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:25.800 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:26.061 /dev/nbd1 00:23:26.061 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:26.061 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:26.061 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:23:26.061 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:23:26.061 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:26.062 1+0 records in 00:23:26.062 1+0 records out 00:23:26.062 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267809 s, 15.3 MB/s 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:23:26.062 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:26.062 13:51:40 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:26.323 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:26.584 13:51:40 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:26.584 13:51:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:26.844 [2024-06-10 13:51:41.253330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:26.844 [2024-06-10 13:51:41.253364] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.844 [2024-06-10 13:51:41.253379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179a2a0 00:23:26.844 [2024-06-10 13:51:41.253386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.844 [2024-06-10 13:51:41.254770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.844 [2024-06-10 13:51:41.254791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:26.844 [2024-06-10 13:51:41.254852] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:26.844 [2024-06-10 13:51:41.254872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:26.844 [2024-06-10 13:51:41.254953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:26.844 spare 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:26.844 13:51:41 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.844 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.104 [2024-06-10 13:51:41.355246] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15f4a80 00:23:27.104 [2024-06-10 13:51:41.355256] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:27.104 [2024-06-10 13:51:41.355412] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x179ded0 00:23:27.104 [2024-06-10 13:51:41.355528] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15f4a80 00:23:27.104 [2024-06-10 13:51:41.355534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15f4a80 00:23:27.104 [2024-06-10 13:51:41.355612] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.104 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.104 "name": "raid_bdev1", 00:23:27.104 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:27.104 "strip_size_kb": 0, 00:23:27.104 "state": "online", 00:23:27.104 "raid_level": "raid1", 00:23:27.104 "superblock": true, 00:23:27.104 "num_base_bdevs": 2, 00:23:27.104 "num_base_bdevs_discovered": 2, 00:23:27.104 "num_base_bdevs_operational": 2, 00:23:27.104 "base_bdevs_list": [ 00:23:27.104 { 00:23:27.104 "name": "spare", 00:23:27.104 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:27.104 "is_configured": true, 00:23:27.104 "data_offset": 256, 00:23:27.104 "data_size": 7936 00:23:27.104 }, 00:23:27.104 { 00:23:27.104 "name": "BaseBdev2", 00:23:27.104 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:27.104 "is_configured": true, 00:23:27.104 "data_offset": 256, 00:23:27.104 "data_size": 7936 00:23:27.104 } 00:23:27.104 ] 00:23:27.104 }' 00:23:27.104 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.104 13:51:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.675 13:51:42 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.934 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.934 "name": "raid_bdev1", 00:23:27.935 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:27.935 "strip_size_kb": 0, 00:23:27.935 "state": "online", 00:23:27.935 "raid_level": "raid1", 00:23:27.935 "superblock": true, 00:23:27.935 "num_base_bdevs": 2, 00:23:27.935 "num_base_bdevs_discovered": 2, 00:23:27.935 "num_base_bdevs_operational": 2, 00:23:27.935 "base_bdevs_list": [ 00:23:27.935 { 00:23:27.935 "name": "spare", 00:23:27.935 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:27.935 "is_configured": true, 00:23:27.935 "data_offset": 256, 00:23:27.935 "data_size": 7936 00:23:27.935 }, 00:23:27.935 { 00:23:27.935 "name": "BaseBdev2", 00:23:27.935 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:27.935 "is_configured": true, 00:23:27.935 "data_offset": 256, 00:23:27.935 "data_size": 7936 00:23:27.935 } 00:23:27.935 ] 00:23:27.935 }' 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.935 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:28.196 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == 
\s\p\a\r\e ]] 00:23:28.196 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:28.457 [2024-06-10 13:51:42.737193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.457 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.717 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.717 "name": "raid_bdev1", 00:23:28.717 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 
00:23:28.717 "strip_size_kb": 0, 00:23:28.717 "state": "online", 00:23:28.717 "raid_level": "raid1", 00:23:28.717 "superblock": true, 00:23:28.717 "num_base_bdevs": 2, 00:23:28.717 "num_base_bdevs_discovered": 1, 00:23:28.717 "num_base_bdevs_operational": 1, 00:23:28.717 "base_bdevs_list": [ 00:23:28.717 { 00:23:28.717 "name": null, 00:23:28.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.717 "is_configured": false, 00:23:28.717 "data_offset": 256, 00:23:28.717 "data_size": 7936 00:23:28.717 }, 00:23:28.717 { 00:23:28.717 "name": "BaseBdev2", 00:23:28.717 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:28.717 "is_configured": true, 00:23:28.717 "data_offset": 256, 00:23:28.717 "data_size": 7936 00:23:28.717 } 00:23:28.717 ] 00:23:28.717 }' 00:23:28.717 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.717 13:51:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:29.287 13:51:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:29.287 [2024-06-10 13:51:43.707654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.287 [2024-06-10 13:51:43.707769] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:29.287 [2024-06-10 13:51:43.707778] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:29.287 [2024-06-10 13:51:43.707797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.287 [2024-06-10 13:51:43.711187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15f40c0 00:23:29.287 [2024-06-10 13:51:43.712912] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:29.287 13:51:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.673 "name": "raid_bdev1", 00:23:30.673 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:30.673 "strip_size_kb": 0, 00:23:30.673 "state": "online", 00:23:30.673 "raid_level": "raid1", 00:23:30.673 "superblock": true, 00:23:30.673 "num_base_bdevs": 2, 00:23:30.673 "num_base_bdevs_discovered": 2, 00:23:30.673 "num_base_bdevs_operational": 2, 00:23:30.673 "process": { 00:23:30.673 "type": "rebuild", 00:23:30.673 "target": "spare", 00:23:30.673 "progress": { 00:23:30.673 "blocks": 3072, 
00:23:30.673 "percent": 38 00:23:30.673 } 00:23:30.673 }, 00:23:30.673 "base_bdevs_list": [ 00:23:30.673 { 00:23:30.673 "name": "spare", 00:23:30.673 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:30.673 "is_configured": true, 00:23:30.673 "data_offset": 256, 00:23:30.673 "data_size": 7936 00:23:30.673 }, 00:23:30.673 { 00:23:30.673 "name": "BaseBdev2", 00:23:30.673 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:30.673 "is_configured": true, 00:23:30.673 "data_offset": 256, 00:23:30.673 "data_size": 7936 00:23:30.673 } 00:23:30.673 ] 00:23:30.673 }' 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:30.673 13:51:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.673 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:30.673 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:30.933 [2024-06-10 13:51:45.205329] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.933 [2024-06-10 13:51:45.221974] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:30.933 [2024-06-10 13:51:45.222007] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.933 [2024-06-10 13:51:45.222017] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.933 [2024-06-10 13:51:45.222021] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.933 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.193 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.193 "name": "raid_bdev1", 00:23:31.193 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:31.193 "strip_size_kb": 0, 00:23:31.193 "state": "online", 00:23:31.193 "raid_level": "raid1", 00:23:31.193 "superblock": true, 00:23:31.193 "num_base_bdevs": 2, 00:23:31.193 "num_base_bdevs_discovered": 1, 00:23:31.193 "num_base_bdevs_operational": 1, 00:23:31.193 "base_bdevs_list": [ 00:23:31.193 { 00:23:31.193 "name": null, 00:23:31.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.193 "is_configured": false, 00:23:31.193 "data_offset": 
256, 00:23:31.193 "data_size": 7936 00:23:31.193 }, 00:23:31.193 { 00:23:31.193 "name": "BaseBdev2", 00:23:31.193 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:31.193 "is_configured": true, 00:23:31.193 "data_offset": 256, 00:23:31.193 "data_size": 7936 00:23:31.193 } 00:23:31.193 ] 00:23:31.193 }' 00:23:31.193 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.193 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:31.764 13:51:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:31.764 [2024-06-10 13:51:46.172243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:31.764 [2024-06-10 13:51:46.172280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:31.764 [2024-06-10 13:51:46.172298] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ee200 00:23:31.764 [2024-06-10 13:51:46.172304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:31.764 [2024-06-10 13:51:46.172627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:31.764 [2024-06-10 13:51:46.172644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:31.764 [2024-06-10 13:51:46.172704] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:31.764 [2024-06-10 13:51:46.172711] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:31.764 [2024-06-10 13:51:46.172717] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:31.764 [2024-06-10 13:51:46.172729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:31.764 [2024-06-10 13:51:46.176102] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15f4130 00:23:31.764 [2024-06-10 13:51:46.177337] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:31.764 spare 00:23:31.764 13:51:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:32.847 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.847 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.848 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:32.848 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:32.848 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.848 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.848 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.108 "name": "raid_bdev1", 00:23:33.108 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:33.108 "strip_size_kb": 0, 00:23:33.108 "state": "online", 00:23:33.108 "raid_level": "raid1", 00:23:33.108 "superblock": true, 00:23:33.108 "num_base_bdevs": 2, 00:23:33.108 "num_base_bdevs_discovered": 2, 00:23:33.108 "num_base_bdevs_operational": 2, 00:23:33.108 "process": { 00:23:33.108 "type": "rebuild", 00:23:33.108 "target": "spare", 00:23:33.108 "progress": { 00:23:33.108 
"blocks": 2816, 00:23:33.108 "percent": 35 00:23:33.108 } 00:23:33.108 }, 00:23:33.108 "base_bdevs_list": [ 00:23:33.108 { 00:23:33.108 "name": "spare", 00:23:33.108 "uuid": "afe2518f-3629-5a29-b4c8-df318766aa97", 00:23:33.108 "is_configured": true, 00:23:33.108 "data_offset": 256, 00:23:33.108 "data_size": 7936 00:23:33.108 }, 00:23:33.108 { 00:23:33.108 "name": "BaseBdev2", 00:23:33.108 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:33.108 "is_configured": true, 00:23:33.108 "data_offset": 256, 00:23:33.108 "data_size": 7936 00:23:33.108 } 00:23:33.108 ] 00:23:33.108 }' 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.108 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:33.368 [2024-06-10 13:51:47.658247] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.368 [2024-06-10 13:51:47.686285] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:33.368 [2024-06-10 13:51:47.686315] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.368 [2024-06-10 13:51:47.686326] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.368 [2024-06-10 13:51:47.686331] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.368 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.627 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.627 "name": "raid_bdev1", 00:23:33.627 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:33.627 "strip_size_kb": 0, 00:23:33.627 "state": "online", 00:23:33.628 "raid_level": "raid1", 00:23:33.628 "superblock": true, 00:23:33.628 "num_base_bdevs": 2, 00:23:33.628 "num_base_bdevs_discovered": 1, 00:23:33.628 "num_base_bdevs_operational": 1, 00:23:33.628 "base_bdevs_list": [ 00:23:33.628 { 00:23:33.628 "name": null, 00:23:33.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.628 "is_configured": false, 00:23:33.628 
"data_offset": 256, 00:23:33.628 "data_size": 7936 00:23:33.628 }, 00:23:33.628 { 00:23:33.628 "name": "BaseBdev2", 00:23:33.628 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:33.628 "is_configured": true, 00:23:33.628 "data_offset": 256, 00:23:33.628 "data_size": 7936 00:23:33.628 } 00:23:33.628 ] 00:23:33.628 }' 00:23:33.628 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.628 13:51:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.198 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:34.458 "name": "raid_bdev1", 00:23:34.458 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:34.458 "strip_size_kb": 0, 00:23:34.458 "state": "online", 00:23:34.458 "raid_level": "raid1", 00:23:34.458 "superblock": true, 00:23:34.458 "num_base_bdevs": 2, 00:23:34.458 "num_base_bdevs_discovered": 1, 00:23:34.458 "num_base_bdevs_operational": 1, 00:23:34.458 "base_bdevs_list": [ 00:23:34.458 { 00:23:34.458 "name": null, 00:23:34.458 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:34.458 "is_configured": false, 00:23:34.458 "data_offset": 256, 00:23:34.458 "data_size": 7936 00:23:34.458 }, 00:23:34.458 { 00:23:34.458 "name": "BaseBdev2", 00:23:34.458 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:34.458 "is_configured": true, 00:23:34.458 "data_offset": 256, 00:23:34.458 "data_size": 7936 00:23:34.458 } 00:23:34.458 ] 00:23:34.458 }' 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:34.458 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:34.718 13:51:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:34.718 [2024-06-10 13:51:49.154036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:34.718 [2024-06-10 13:51:49.154065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.718 [2024-06-10 13:51:49.154080] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x179a4e0 00:23:34.718 [2024-06-10 13:51:49.154087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.718 [2024-06-10 13:51:49.154386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.718 [2024-06-10 13:51:49.154399] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:23:34.718 [2024-06-10 13:51:49.154445] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:34.718 [2024-06-10 13:51:49.154452] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:34.718 [2024-06-10 13:51:49.154458] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:34.718 BaseBdev1 00:23:34.718 13:51:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.099 13:51:50 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.099 "name": "raid_bdev1", 00:23:36.099 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:36.099 "strip_size_kb": 0, 00:23:36.099 "state": "online", 00:23:36.099 "raid_level": "raid1", 00:23:36.099 "superblock": true, 00:23:36.099 "num_base_bdevs": 2, 00:23:36.099 "num_base_bdevs_discovered": 1, 00:23:36.099 "num_base_bdevs_operational": 1, 00:23:36.099 "base_bdevs_list": [ 00:23:36.099 { 00:23:36.099 "name": null, 00:23:36.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.099 "is_configured": false, 00:23:36.099 "data_offset": 256, 00:23:36.099 "data_size": 7936 00:23:36.099 }, 00:23:36.099 { 00:23:36.099 "name": "BaseBdev2", 00:23:36.099 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:36.099 "is_configured": true, 00:23:36.099 "data_offset": 256, 00:23:36.099 "data_size": 7936 00:23:36.099 } 00:23:36.099 ] 00:23:36.099 }' 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.099 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:36.669 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.670 13:51:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.930 "name": "raid_bdev1", 00:23:36.930 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:36.930 "strip_size_kb": 0, 00:23:36.930 "state": "online", 00:23:36.930 "raid_level": "raid1", 00:23:36.930 "superblock": true, 00:23:36.930 "num_base_bdevs": 2, 00:23:36.930 "num_base_bdevs_discovered": 1, 00:23:36.930 "num_base_bdevs_operational": 1, 00:23:36.930 "base_bdevs_list": [ 00:23:36.930 { 00:23:36.930 "name": null, 00:23:36.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.930 "is_configured": false, 00:23:36.930 "data_offset": 256, 00:23:36.930 "data_size": 7936 00:23:36.930 }, 00:23:36.930 { 00:23:36.930 "name": "BaseBdev2", 00:23:36.930 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:36.930 "is_configured": true, 00:23:36.930 "data_offset": 256, 00:23:36.930 "data_size": 7936 00:23:36.930 } 00:23:36.930 ] 00:23:36.930 }' 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@649 -- # local es=0 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:36.930 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:37.190 [2024-06-10 13:51:51.431851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:37.190 [2024-06-10 13:51:51.431956] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:37.190 [2024-06-10 13:51:51.431965] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:23:37.190 request: 00:23:37.190 { 00:23:37.190 "raid_bdev": "raid_bdev1", 00:23:37.190 "base_bdev": "BaseBdev1", 00:23:37.190 "method": "bdev_raid_add_base_bdev", 00:23:37.190 "req_id": 1 00:23:37.190 } 00:23:37.190 Got JSON-RPC error response 00:23:37.190 response: 00:23:37.190 { 00:23:37.190 "code": -22, 00:23:37.190 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:37.190 } 00:23:37.190 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # es=1 00:23:37.190 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:37.190 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:37.190 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:37.190 13:51:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.129 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:23:38.130 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.130 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.130 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.389 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.389 "name": "raid_bdev1", 00:23:38.389 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:38.389 "strip_size_kb": 0, 00:23:38.389 "state": "online", 00:23:38.389 "raid_level": "raid1", 00:23:38.389 "superblock": true, 00:23:38.389 "num_base_bdevs": 2, 00:23:38.389 "num_base_bdevs_discovered": 1, 00:23:38.389 "num_base_bdevs_operational": 1, 00:23:38.389 "base_bdevs_list": [ 00:23:38.389 { 00:23:38.389 "name": null, 00:23:38.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.389 "is_configured": false, 00:23:38.389 "data_offset": 256, 00:23:38.389 "data_size": 7936 00:23:38.389 }, 00:23:38.389 { 00:23:38.389 "name": "BaseBdev2", 00:23:38.389 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:38.389 "is_configured": true, 00:23:38.389 "data_offset": 256, 00:23:38.389 "data_size": 7936 00:23:38.389 } 00:23:38.389 ] 00:23:38.389 }' 00:23:38.389 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.389 13:51:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:38.959 
13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.959 "name": "raid_bdev1", 00:23:38.959 "uuid": "f99e85a5-7894-44c7-81b2-befde4e32072", 00:23:38.959 "strip_size_kb": 0, 00:23:38.959 "state": "online", 00:23:38.959 "raid_level": "raid1", 00:23:38.959 "superblock": true, 00:23:38.959 "num_base_bdevs": 2, 00:23:38.959 "num_base_bdevs_discovered": 1, 00:23:38.959 "num_base_bdevs_operational": 1, 00:23:38.959 "base_bdevs_list": [ 00:23:38.959 { 00:23:38.959 "name": null, 00:23:38.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.959 "is_configured": false, 00:23:38.959 "data_offset": 256, 00:23:38.959 "data_size": 7936 00:23:38.959 }, 00:23:38.959 { 00:23:38.959 "name": "BaseBdev2", 00:23:38.959 "uuid": "834a64a3-b312-5b15-b554-a10eb732976e", 00:23:38.959 "is_configured": true, 00:23:38.959 "data_offset": 256, 00:23:38.959 "data_size": 7936 00:23:38.959 } 00:23:38.959 ] 00:23:38.959 }' 00:23:38.959 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.219 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:39.219 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.219 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.220 13:51:53 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1668416 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 1668416 ']' 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 1668416 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1668416 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1668416' 00:23:39.220 killing process with pid 1668416 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # kill 1668416 00:23:39.220 Received shutdown signal, test time was about 60.000000 seconds 00:23:39.220 00:23:39.220 Latency(us) 00:23:39.220 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:39.220 =================================================================================================================== 00:23:39.220 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:39.220 [2024-06-10 13:51:53.559348] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:39.220 [2024-06-10 13:51:53.559420] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:39.220 [2024-06-10 13:51:53.559452] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:39.220 [2024-06-10 13:51:53.559459] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15f4a80 name raid_bdev1, state offline 00:23:39.220 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@973 -- # wait 1668416 00:23:39.220 [2024-06-10 13:51:53.574905] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:39.480 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:23:39.480 00:23:39.480 real 0m28.247s 00:23:39.480 user 0m44.339s 00:23:39.480 sys 0m3.588s 00:23:39.480 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:39.480 13:51:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:23:39.480 ************************************ 00:23:39.480 END TEST raid_rebuild_test_sb_4k 00:23:39.480 ************************************ 00:23:39.480 13:51:53 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:23:39.480 13:51:53 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:23:39.480 13:51:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:23:39.480 13:51:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:39.480 13:51:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:39.480 ************************************ 00:23:39.480 START TEST raid_state_function_test_sb_md_separate 00:23:39.480 ************************************ 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # 
'[' raid1 '!=' raid1 ']' 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1674207 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1674207' 00:23:39.480 Process raid pid: 1674207 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1674207 /var/tmp/spdk-raid.sock 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1674207 ']' 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:39.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:39.480 13:51:53 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:39.480 [2024-06-10 13:51:53.841479] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:23:39.480 [2024-06-10 13:51:53.841528] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:39.481 [2024-06-10 13:51:53.932702] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:39.741 [2024-06-10 13:51:54.002209] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:23:39.741 [2024-06-10 13:51:54.048282] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:39.741 [2024-06-10 13:51:54.048304] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:40.311 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:40.311 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:23:40.311 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:40.571 [2024-06-10 13:51:54.888185] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:40.571 [2024-06-10 13:51:54.888216] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:40.571 [2024-06-10 13:51:54.888222] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:40.571 [2024-06-10 13:51:54.888228] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.571 13:51:54 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:40.831 13:51:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.831 "name": "Existed_Raid", 00:23:40.831 "uuid": "d1dde28f-a2be-4b44-bbf4-ea34fd10e5f6", 00:23:40.831 
"strip_size_kb": 0, 00:23:40.831 "state": "configuring", 00:23:40.831 "raid_level": "raid1", 00:23:40.831 "superblock": true, 00:23:40.831 "num_base_bdevs": 2, 00:23:40.831 "num_base_bdevs_discovered": 0, 00:23:40.831 "num_base_bdevs_operational": 2, 00:23:40.831 "base_bdevs_list": [ 00:23:40.831 { 00:23:40.831 "name": "BaseBdev1", 00:23:40.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.831 "is_configured": false, 00:23:40.831 "data_offset": 0, 00:23:40.831 "data_size": 0 00:23:40.831 }, 00:23:40.831 { 00:23:40.831 "name": "BaseBdev2", 00:23:40.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.831 "is_configured": false, 00:23:40.831 "data_offset": 0, 00:23:40.831 "data_size": 0 00:23:40.831 } 00:23:40.831 ] 00:23:40.831 }' 00:23:40.831 13:51:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.831 13:51:55 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:41.401 13:51:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:41.401 [2024-06-10 13:51:55.822425] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:41.401 [2024-06-10 13:51:55.822443] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbf7720 name Existed_Raid, state configuring 00:23:41.401 13:51:55 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:41.660 [2024-06-10 13:51:56.022953] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:41.660 [2024-06-10 13:51:56.022970] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:23:41.660 [2024-06-10 13:51:56.022976] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:41.660 [2024-06-10 13:51:56.022982] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:41.660 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:23:41.919 [2024-06-10 13:51:56.230786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:41.919 BaseBdev1 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:41.919 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:42.179 [ 00:23:42.179 { 00:23:42.179 "name": "BaseBdev1", 00:23:42.179 "aliases": [ 00:23:42.179 
"1b041abc-916c-4bda-a529-89b2d3f3e88a" 00:23:42.179 ], 00:23:42.179 "product_name": "Malloc disk", 00:23:42.179 "block_size": 4096, 00:23:42.179 "num_blocks": 8192, 00:23:42.179 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:42.179 "md_size": 32, 00:23:42.179 "md_interleave": false, 00:23:42.179 "dif_type": 0, 00:23:42.179 "assigned_rate_limits": { 00:23:42.179 "rw_ios_per_sec": 0, 00:23:42.179 "rw_mbytes_per_sec": 0, 00:23:42.179 "r_mbytes_per_sec": 0, 00:23:42.179 "w_mbytes_per_sec": 0 00:23:42.179 }, 00:23:42.179 "claimed": true, 00:23:42.179 "claim_type": "exclusive_write", 00:23:42.179 "zoned": false, 00:23:42.179 "supported_io_types": { 00:23:42.179 "read": true, 00:23:42.179 "write": true, 00:23:42.179 "unmap": true, 00:23:42.179 "write_zeroes": true, 00:23:42.179 "flush": true, 00:23:42.179 "reset": true, 00:23:42.179 "compare": false, 00:23:42.179 "compare_and_write": false, 00:23:42.179 "abort": true, 00:23:42.179 "nvme_admin": false, 00:23:42.179 "nvme_io": false 00:23:42.179 }, 00:23:42.179 "memory_domains": [ 00:23:42.179 { 00:23:42.179 "dma_device_id": "system", 00:23:42.179 "dma_device_type": 1 00:23:42.179 }, 00:23:42.179 { 00:23:42.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.179 "dma_device_type": 2 00:23:42.179 } 00:23:42.179 ], 00:23:42.179 "driver_specific": {} 00:23:42.179 } 00:23:42.179 ] 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.179 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:42.440 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:42.440 "name": "Existed_Raid", 00:23:42.440 "uuid": "14ade64a-d371-48a3-9651-3554f9910382", 00:23:42.440 "strip_size_kb": 0, 00:23:42.440 "state": "configuring", 00:23:42.440 "raid_level": "raid1", 00:23:42.440 "superblock": true, 00:23:42.440 "num_base_bdevs": 2, 00:23:42.440 "num_base_bdevs_discovered": 1, 00:23:42.440 "num_base_bdevs_operational": 2, 00:23:42.440 "base_bdevs_list": [ 00:23:42.440 { 00:23:42.440 "name": "BaseBdev1", 00:23:42.440 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:42.440 "is_configured": true, 00:23:42.440 "data_offset": 256, 00:23:42.440 "data_size": 7936 00:23:42.440 }, 00:23:42.440 { 00:23:42.440 "name": "BaseBdev2", 00:23:42.440 "uuid": "00000000-0000-0000-0000-000000000000", 
00:23:42.440 "is_configured": false, 00:23:42.440 "data_offset": 0, 00:23:42.440 "data_size": 0 00:23:42.440 } 00:23:42.440 ] 00:23:42.440 }' 00:23:42.440 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:42.440 13:51:56 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:43.012 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:43.272 [2024-06-10 13:51:57.558158] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:43.272 [2024-06-10 13:51:57.558189] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbf7010 name Existed_Raid, state configuring 00:23:43.272 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:43.533 [2024-06-10 13:51:57.758697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:43.533 [2024-06-10 13:51:57.759884] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:43.533 [2024-06-10 13:51:57.759907] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.533 "name": "Existed_Raid", 00:23:43.533 "uuid": "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1", 00:23:43.533 "strip_size_kb": 0, 00:23:43.533 "state": "configuring", 00:23:43.533 "raid_level": "raid1", 00:23:43.533 "superblock": true, 00:23:43.533 "num_base_bdevs": 2, 00:23:43.533 "num_base_bdevs_discovered": 1, 00:23:43.533 "num_base_bdevs_operational": 2, 00:23:43.533 "base_bdevs_list": [ 00:23:43.533 { 00:23:43.533 "name": "BaseBdev1", 
00:23:43.533 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:43.533 "is_configured": true, 00:23:43.533 "data_offset": 256, 00:23:43.533 "data_size": 7936 00:23:43.533 }, 00:23:43.533 { 00:23:43.533 "name": "BaseBdev2", 00:23:43.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.533 "is_configured": false, 00:23:43.533 "data_offset": 0, 00:23:43.533 "data_size": 0 00:23:43.533 } 00:23:43.533 ] 00:23:43.533 }' 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.533 13:51:57 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:44.105 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:23:44.365 [2024-06-10 13:51:58.714635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:44.365 [2024-06-10 13:51:58.714741] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbf90d0 00:23:44.365 [2024-06-10 13:51:58.714749] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:44.365 [2024-06-10 13:51:58.714795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf8b10 00:23:44.365 [2024-06-10 13:51:58.714873] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbf90d0 00:23:44.365 [2024-06-10 13:51:58.714879] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbf90d0 00:23:44.366 [2024-06-10 13:51:58.714932] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:44.366 BaseBdev2 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate 
-- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:23:44.366 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:44.626 13:51:58 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:44.887 [ 00:23:44.887 { 00:23:44.887 "name": "BaseBdev2", 00:23:44.887 "aliases": [ 00:23:44.887 "7a77ad7b-16aa-4585-a400-4850e51f898b" 00:23:44.887 ], 00:23:44.887 "product_name": "Malloc disk", 00:23:44.887 "block_size": 4096, 00:23:44.887 "num_blocks": 8192, 00:23:44.887 "uuid": "7a77ad7b-16aa-4585-a400-4850e51f898b", 00:23:44.887 "md_size": 32, 00:23:44.887 "md_interleave": false, 00:23:44.887 "dif_type": 0, 00:23:44.887 "assigned_rate_limits": { 00:23:44.887 "rw_ios_per_sec": 0, 00:23:44.887 "rw_mbytes_per_sec": 0, 00:23:44.887 "r_mbytes_per_sec": 0, 00:23:44.887 "w_mbytes_per_sec": 0 00:23:44.887 }, 00:23:44.887 "claimed": true, 00:23:44.887 "claim_type": "exclusive_write", 00:23:44.887 "zoned": false, 00:23:44.887 "supported_io_types": { 00:23:44.887 "read": true, 00:23:44.887 "write": true, 00:23:44.887 "unmap": true, 00:23:44.887 "write_zeroes": true, 00:23:44.887 "flush": true, 00:23:44.887 "reset": true, 00:23:44.887 "compare": false, 00:23:44.887 
"compare_and_write": false, 00:23:44.887 "abort": true, 00:23:44.887 "nvme_admin": false, 00:23:44.887 "nvme_io": false 00:23:44.887 }, 00:23:44.887 "memory_domains": [ 00:23:44.887 { 00:23:44.887 "dma_device_id": "system", 00:23:44.887 "dma_device_type": 1 00:23:44.887 }, 00:23:44.887 { 00:23:44.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.887 "dma_device_type": 2 00:23:44.887 } 00:23:44.887 ], 00:23:44.887 "driver_specific": {} 00:23:44.887 } 00:23:44.887 ] 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:44.887 "name": "Existed_Raid", 00:23:44.887 "uuid": "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1", 00:23:44.887 "strip_size_kb": 0, 00:23:44.887 "state": "online", 00:23:44.887 "raid_level": "raid1", 00:23:44.887 "superblock": true, 00:23:44.887 "num_base_bdevs": 2, 00:23:44.887 "num_base_bdevs_discovered": 2, 00:23:44.887 "num_base_bdevs_operational": 2, 00:23:44.887 "base_bdevs_list": [ 00:23:44.887 { 00:23:44.887 "name": "BaseBdev1", 00:23:44.887 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:44.887 "is_configured": true, 00:23:44.887 "data_offset": 256, 00:23:44.887 "data_size": 7936 00:23:44.887 }, 00:23:44.887 { 00:23:44.887 "name": "BaseBdev2", 00:23:44.887 "uuid": "7a77ad7b-16aa-4585-a400-4850e51f898b", 00:23:44.887 "is_configured": true, 00:23:44.887 "data_offset": 256, 00:23:44.887 "data_size": 7936 00:23:44.887 } 00:23:44.887 ] 00:23:44.887 }' 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:44.887 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:45.459 13:51:59 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:45.719 [2024-06-10 13:52:00.026293] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:45.719 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:45.719 "name": "Existed_Raid", 00:23:45.719 "aliases": [ 00:23:45.719 "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1" 00:23:45.719 ], 00:23:45.719 "product_name": "Raid Volume", 00:23:45.719 "block_size": 4096, 00:23:45.719 "num_blocks": 7936, 00:23:45.719 "uuid": "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1", 00:23:45.719 "md_size": 32, 00:23:45.719 "md_interleave": false, 00:23:45.719 "dif_type": 0, 00:23:45.719 "assigned_rate_limits": { 00:23:45.719 "rw_ios_per_sec": 0, 00:23:45.719 "rw_mbytes_per_sec": 0, 00:23:45.719 "r_mbytes_per_sec": 0, 00:23:45.719 "w_mbytes_per_sec": 0 00:23:45.719 }, 00:23:45.719 "claimed": false, 00:23:45.719 "zoned": false, 00:23:45.719 "supported_io_types": { 00:23:45.719 "read": true, 00:23:45.719 "write": true, 00:23:45.719 "unmap": false, 00:23:45.719 "write_zeroes": true, 00:23:45.719 "flush": false, 00:23:45.719 "reset": true, 00:23:45.719 "compare": false, 00:23:45.719 
"compare_and_write": false, 00:23:45.719 "abort": false, 00:23:45.719 "nvme_admin": false, 00:23:45.719 "nvme_io": false 00:23:45.719 }, 00:23:45.719 "memory_domains": [ 00:23:45.719 { 00:23:45.719 "dma_device_id": "system", 00:23:45.719 "dma_device_type": 1 00:23:45.719 }, 00:23:45.719 { 00:23:45.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.719 "dma_device_type": 2 00:23:45.719 }, 00:23:45.719 { 00:23:45.719 "dma_device_id": "system", 00:23:45.719 "dma_device_type": 1 00:23:45.719 }, 00:23:45.719 { 00:23:45.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.719 "dma_device_type": 2 00:23:45.719 } 00:23:45.719 ], 00:23:45.719 "driver_specific": { 00:23:45.719 "raid": { 00:23:45.719 "uuid": "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1", 00:23:45.719 "strip_size_kb": 0, 00:23:45.719 "state": "online", 00:23:45.719 "raid_level": "raid1", 00:23:45.719 "superblock": true, 00:23:45.719 "num_base_bdevs": 2, 00:23:45.719 "num_base_bdevs_discovered": 2, 00:23:45.719 "num_base_bdevs_operational": 2, 00:23:45.719 "base_bdevs_list": [ 00:23:45.719 { 00:23:45.719 "name": "BaseBdev1", 00:23:45.719 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:45.719 "is_configured": true, 00:23:45.719 "data_offset": 256, 00:23:45.719 "data_size": 7936 00:23:45.720 }, 00:23:45.720 { 00:23:45.720 "name": "BaseBdev2", 00:23:45.720 "uuid": "7a77ad7b-16aa-4585-a400-4850e51f898b", 00:23:45.720 "is_configured": true, 00:23:45.720 "data_offset": 256, 00:23:45.720 "data_size": 7936 00:23:45.720 } 00:23:45.720 ] 00:23:45.720 } 00:23:45.720 } 00:23:45.720 }' 00:23:45.720 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:45.720 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:45.720 BaseBdev2' 00:23:45.720 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:23:45.720 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:45.720 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:45.980 "name": "BaseBdev1", 00:23:45.980 "aliases": [ 00:23:45.980 "1b041abc-916c-4bda-a529-89b2d3f3e88a" 00:23:45.980 ], 00:23:45.980 "product_name": "Malloc disk", 00:23:45.980 "block_size": 4096, 00:23:45.980 "num_blocks": 8192, 00:23:45.980 "uuid": "1b041abc-916c-4bda-a529-89b2d3f3e88a", 00:23:45.980 "md_size": 32, 00:23:45.980 "md_interleave": false, 00:23:45.980 "dif_type": 0, 00:23:45.980 "assigned_rate_limits": { 00:23:45.980 "rw_ios_per_sec": 0, 00:23:45.980 "rw_mbytes_per_sec": 0, 00:23:45.980 "r_mbytes_per_sec": 0, 00:23:45.980 "w_mbytes_per_sec": 0 00:23:45.980 }, 00:23:45.980 "claimed": true, 00:23:45.980 "claim_type": "exclusive_write", 00:23:45.980 "zoned": false, 00:23:45.980 "supported_io_types": { 00:23:45.980 "read": true, 00:23:45.980 "write": true, 00:23:45.980 "unmap": true, 00:23:45.980 "write_zeroes": true, 00:23:45.980 "flush": true, 00:23:45.980 "reset": true, 00:23:45.980 "compare": false, 00:23:45.980 "compare_and_write": false, 00:23:45.980 "abort": true, 00:23:45.980 "nvme_admin": false, 00:23:45.980 "nvme_io": false 00:23:45.980 }, 00:23:45.980 "memory_domains": [ 00:23:45.980 { 00:23:45.980 "dma_device_id": "system", 00:23:45.980 "dma_device_type": 1 00:23:45.980 }, 00:23:45.980 { 00:23:45.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:45.980 "dma_device_type": 2 00:23:45.980 } 00:23:45.980 ], 00:23:45.980 "driver_specific": {} 00:23:45.980 }' 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:45.980 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:46.241 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:46.501 "name": "BaseBdev2", 00:23:46.501 "aliases": [ 00:23:46.501 "7a77ad7b-16aa-4585-a400-4850e51f898b" 
00:23:46.501 ], 00:23:46.501 "product_name": "Malloc disk", 00:23:46.501 "block_size": 4096, 00:23:46.501 "num_blocks": 8192, 00:23:46.501 "uuid": "7a77ad7b-16aa-4585-a400-4850e51f898b", 00:23:46.501 "md_size": 32, 00:23:46.501 "md_interleave": false, 00:23:46.501 "dif_type": 0, 00:23:46.501 "assigned_rate_limits": { 00:23:46.501 "rw_ios_per_sec": 0, 00:23:46.501 "rw_mbytes_per_sec": 0, 00:23:46.501 "r_mbytes_per_sec": 0, 00:23:46.501 "w_mbytes_per_sec": 0 00:23:46.501 }, 00:23:46.501 "claimed": true, 00:23:46.501 "claim_type": "exclusive_write", 00:23:46.501 "zoned": false, 00:23:46.501 "supported_io_types": { 00:23:46.501 "read": true, 00:23:46.501 "write": true, 00:23:46.501 "unmap": true, 00:23:46.501 "write_zeroes": true, 00:23:46.501 "flush": true, 00:23:46.501 "reset": true, 00:23:46.501 "compare": false, 00:23:46.501 "compare_and_write": false, 00:23:46.501 "abort": true, 00:23:46.501 "nvme_admin": false, 00:23:46.501 "nvme_io": false 00:23:46.501 }, 00:23:46.501 "memory_domains": [ 00:23:46.501 { 00:23:46.501 "dma_device_id": "system", 00:23:46.501 "dma_device_type": 1 00:23:46.501 }, 00:23:46.501 { 00:23:46.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:46.501 "dma_device_type": 2 00:23:46.501 } 00:23:46.501 ], 00:23:46.501 "driver_specific": {} 00:23:46.501 }' 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:46.501 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:46.761 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
00:23:46.761 13:52:00 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:46.761 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:47.021 [2024-06-10 13:52:01.325401] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.021 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:47.022 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.282 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.282 "name": "Existed_Raid", 00:23:47.282 "uuid": "25d3c03a-f0ce-4473-9f4b-e06d5ad159d1", 00:23:47.282 "strip_size_kb": 0, 00:23:47.282 "state": "online", 00:23:47.282 "raid_level": "raid1", 00:23:47.282 "superblock": true, 00:23:47.282 "num_base_bdevs": 2, 00:23:47.282 "num_base_bdevs_discovered": 1, 00:23:47.282 "num_base_bdevs_operational": 1, 00:23:47.282 "base_bdevs_list": [ 00:23:47.282 { 00:23:47.282 "name": null, 00:23:47.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.282 "is_configured": false, 00:23:47.282 "data_offset": 256, 00:23:47.282 
"data_size": 7936 00:23:47.282 }, 00:23:47.282 { 00:23:47.282 "name": "BaseBdev2", 00:23:47.282 "uuid": "7a77ad7b-16aa-4585-a400-4850e51f898b", 00:23:47.282 "is_configured": true, 00:23:47.282 "data_offset": 256, 00:23:47.282 "data_size": 7936 00:23:47.282 } 00:23:47.282 ] 00:23:47.282 }' 00:23:47.282 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.282 13:52:01 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:47.852 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:48.112 [2024-06-10 13:52:02.510438] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:48.112 [2024-06-10 13:52:02.510508] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:48.112 [2024-06-10 13:52:02.517132] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:48.112 [2024-06-10 13:52:02.517156] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:48.112 [2024-06-10 13:52:02.517168] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbf90d0 name Existed_Raid, state offline 00:23:48.112 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:48.112 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:48.112 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.112 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1674207 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1674207 ']' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 1674207 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1674207 00:23:48.372 13:52:02 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1674207' 00:23:48.372 killing process with pid 1674207 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 1674207 00:23:48.372 [2024-06-10 13:52:02.788972] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:48.372 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 1674207 00:23:48.372 [2024-06-10 13:52:02.789597] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:48.633 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:23:48.633 00:23:48.633 real 0m9.134s 00:23:48.633 user 0m16.512s 00:23:48.633 sys 0m1.438s 00:23:48.633 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:48.633 13:52:02 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:48.633 ************************************ 00:23:48.633 END TEST raid_state_function_test_sb_md_separate 00:23:48.633 ************************************ 00:23:48.633 13:52:02 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:23:48.633 13:52:02 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:23:48.633 13:52:02 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:48.633 13:52:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:48.633 ************************************ 00:23:48.633 START TEST raid_superblock_test_md_separate 
00:23:48.633 ************************************ 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@411 -- # raid_pid=1676337 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1676337 /var/tmp/spdk-raid.sock 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1676337 ']' 00:23:48.633 13:52:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:48.633 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:48.633 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:48.633 [2024-06-10 13:52:03.056146] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:23:48.633 [2024-06-10 13:52:03.056220] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1676337 ] 00:23:48.893 [2024-06-10 13:52:03.148575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:48.893 [2024-06-10 13:52:03.216975] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:23:48.893 [2024-06-10 13:52:03.262465] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:48.893 [2024-06-10 13:52:03.262489] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@863 -- # return 0 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:49.463 13:52:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:49.464 13:52:03 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:23:49.723 malloc1 00:23:49.723 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:49.984 [2024-06-10 13:52:04.302013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:49.984 [2024-06-10 13:52:04.302047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.984 [2024-06-10 13:52:04.302060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ddaa0 00:23:49.984 [2024-06-10 13:52:04.302067] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.984 [2024-06-10 13:52:04.303338] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.984 [2024-06-10 13:52:04.303359] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:49.984 pt1 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:49.984 13:52:04 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:49.984 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:23:50.245 malloc2 00:23:50.245 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:50.245 [2024-06-10 13:52:04.705707] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:50.245 [2024-06-10 13:52:04.705735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:50.245 [2024-06-10 13:52:04.705746] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23dc590 00:23:50.245 [2024-06-10 13:52:04.705752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:50.245 [2024-06-10 13:52:04.706887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:50.245 [2024-06-10 13:52:04.706905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:50.245 pt2 00:23:50.505 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:50.506 [2024-06-10 13:52:04.906228] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:50.506 [2024-06-10 13:52:04.907292] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:50.506 [2024-06-10 13:52:04.907405] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2239c90 00:23:50.506 [2024-06-10 13:52:04.907413] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:50.506 [2024-06-10 13:52:04.907465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223bd90 00:23:50.506 [2024-06-10 13:52:04.907556] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2239c90 00:23:50.506 [2024-06-10 13:52:04.907562] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2239c90 00:23:50.506 [2024-06-10 13:52:04.907614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:50.506 13:52:04 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.506 13:52:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.766 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.766 "name": "raid_bdev1", 00:23:50.766 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:50.766 "strip_size_kb": 0, 00:23:50.766 "state": "online", 00:23:50.766 "raid_level": "raid1", 00:23:50.766 "superblock": true, 00:23:50.766 "num_base_bdevs": 2, 00:23:50.766 "num_base_bdevs_discovered": 2, 00:23:50.766 "num_base_bdevs_operational": 2, 00:23:50.766 "base_bdevs_list": [ 00:23:50.766 { 00:23:50.766 "name": "pt1", 00:23:50.766 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:50.766 "is_configured": true, 00:23:50.766 "data_offset": 256, 00:23:50.766 "data_size": 7936 00:23:50.766 }, 00:23:50.766 { 00:23:50.766 "name": "pt2", 00:23:50.766 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:50.766 "is_configured": true, 00:23:50.766 "data_offset": 256, 00:23:50.766 "data_size": 7936 00:23:50.766 } 00:23:50.766 ] 00:23:50.766 }' 00:23:50.766 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.766 13:52:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:51.337 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:51.598 [2024-06-10 13:52:05.844772] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:51.598 "name": "raid_bdev1", 00:23:51.598 "aliases": [ 00:23:51.598 "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9" 00:23:51.598 ], 00:23:51.598 "product_name": "Raid Volume", 00:23:51.598 "block_size": 4096, 00:23:51.598 "num_blocks": 7936, 00:23:51.598 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:51.598 "md_size": 32, 00:23:51.598 "md_interleave": false, 00:23:51.598 "dif_type": 0, 00:23:51.598 "assigned_rate_limits": { 00:23:51.598 "rw_ios_per_sec": 0, 00:23:51.598 "rw_mbytes_per_sec": 0, 00:23:51.598 "r_mbytes_per_sec": 0, 00:23:51.598 "w_mbytes_per_sec": 0 00:23:51.598 }, 00:23:51.598 "claimed": false, 00:23:51.598 "zoned": false, 00:23:51.598 "supported_io_types": { 00:23:51.598 "read": true, 00:23:51.598 "write": true, 00:23:51.598 "unmap": false, 00:23:51.598 "write_zeroes": true, 00:23:51.598 "flush": false, 00:23:51.598 "reset": true, 00:23:51.598 "compare": false, 00:23:51.598 "compare_and_write": false, 00:23:51.598 "abort": false, 
00:23:51.598 "nvme_admin": false, 00:23:51.598 "nvme_io": false 00:23:51.598 }, 00:23:51.598 "memory_domains": [ 00:23:51.598 { 00:23:51.598 "dma_device_id": "system", 00:23:51.598 "dma_device_type": 1 00:23:51.598 }, 00:23:51.598 { 00:23:51.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:51.598 "dma_device_type": 2 00:23:51.598 }, 00:23:51.598 { 00:23:51.598 "dma_device_id": "system", 00:23:51.598 "dma_device_type": 1 00:23:51.598 }, 00:23:51.598 { 00:23:51.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:51.598 "dma_device_type": 2 00:23:51.598 } 00:23:51.598 ], 00:23:51.598 "driver_specific": { 00:23:51.598 "raid": { 00:23:51.598 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:51.598 "strip_size_kb": 0, 00:23:51.598 "state": "online", 00:23:51.598 "raid_level": "raid1", 00:23:51.598 "superblock": true, 00:23:51.598 "num_base_bdevs": 2, 00:23:51.598 "num_base_bdevs_discovered": 2, 00:23:51.598 "num_base_bdevs_operational": 2, 00:23:51.598 "base_bdevs_list": [ 00:23:51.598 { 00:23:51.598 "name": "pt1", 00:23:51.598 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:51.598 "is_configured": true, 00:23:51.598 "data_offset": 256, 00:23:51.598 "data_size": 7936 00:23:51.598 }, 00:23:51.598 { 00:23:51.598 "name": "pt2", 00:23:51.598 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:51.598 "is_configured": true, 00:23:51.598 "data_offset": 256, 00:23:51.598 "data_size": 7936 00:23:51.598 } 00:23:51.598 ] 00:23:51.598 } 00:23:51.598 } 00:23:51.598 }' 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:51.598 pt2' 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:51.598 13:52:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:51.859 "name": "pt1", 00:23:51.859 "aliases": [ 00:23:51.859 "00000000-0000-0000-0000-000000000001" 00:23:51.859 ], 00:23:51.859 "product_name": "passthru", 00:23:51.859 "block_size": 4096, 00:23:51.859 "num_blocks": 8192, 00:23:51.859 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:51.859 "md_size": 32, 00:23:51.859 "md_interleave": false, 00:23:51.859 "dif_type": 0, 00:23:51.859 "assigned_rate_limits": { 00:23:51.859 "rw_ios_per_sec": 0, 00:23:51.859 "rw_mbytes_per_sec": 0, 00:23:51.859 "r_mbytes_per_sec": 0, 00:23:51.859 "w_mbytes_per_sec": 0 00:23:51.859 }, 00:23:51.859 "claimed": true, 00:23:51.859 "claim_type": "exclusive_write", 00:23:51.859 "zoned": false, 00:23:51.859 "supported_io_types": { 00:23:51.859 "read": true, 00:23:51.859 "write": true, 00:23:51.859 "unmap": true, 00:23:51.859 "write_zeroes": true, 00:23:51.859 "flush": true, 00:23:51.859 "reset": true, 00:23:51.859 "compare": false, 00:23:51.859 "compare_and_write": false, 00:23:51.859 "abort": true, 00:23:51.859 "nvme_admin": false, 00:23:51.859 "nvme_io": false 00:23:51.859 }, 00:23:51.859 "memory_domains": [ 00:23:51.859 { 00:23:51.859 "dma_device_id": "system", 00:23:51.859 "dma_device_type": 1 00:23:51.859 }, 00:23:51.859 { 00:23:51.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:51.859 "dma_device_type": 2 00:23:51.859 } 00:23:51.859 ], 00:23:51.859 "driver_specific": { 00:23:51.859 "passthru": { 00:23:51.859 "name": "pt1", 00:23:51.859 "base_bdev_name": "malloc1" 00:23:51.859 } 00:23:51.859 } 00:23:51.859 }' 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:51.859 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:52.121 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:52.382 "name": "pt2", 00:23:52.382 "aliases": [ 00:23:52.382 "00000000-0000-0000-0000-000000000002" 00:23:52.382 ], 00:23:52.382 "product_name": "passthru", 00:23:52.382 "block_size": 4096, 00:23:52.382 "num_blocks": 8192, 
00:23:52.382 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:52.382 "md_size": 32, 00:23:52.382 "md_interleave": false, 00:23:52.382 "dif_type": 0, 00:23:52.382 "assigned_rate_limits": { 00:23:52.382 "rw_ios_per_sec": 0, 00:23:52.382 "rw_mbytes_per_sec": 0, 00:23:52.382 "r_mbytes_per_sec": 0, 00:23:52.382 "w_mbytes_per_sec": 0 00:23:52.382 }, 00:23:52.382 "claimed": true, 00:23:52.382 "claim_type": "exclusive_write", 00:23:52.382 "zoned": false, 00:23:52.382 "supported_io_types": { 00:23:52.382 "read": true, 00:23:52.382 "write": true, 00:23:52.382 "unmap": true, 00:23:52.382 "write_zeroes": true, 00:23:52.382 "flush": true, 00:23:52.382 "reset": true, 00:23:52.382 "compare": false, 00:23:52.382 "compare_and_write": false, 00:23:52.382 "abort": true, 00:23:52.382 "nvme_admin": false, 00:23:52.382 "nvme_io": false 00:23:52.382 }, 00:23:52.382 "memory_domains": [ 00:23:52.382 { 00:23:52.382 "dma_device_id": "system", 00:23:52.382 "dma_device_type": 1 00:23:52.382 }, 00:23:52.382 { 00:23:52.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:52.382 "dma_device_type": 2 00:23:52.382 } 00:23:52.382 ], 00:23:52.382 "driver_specific": { 00:23:52.382 "passthru": { 00:23:52.382 "name": "pt2", 00:23:52.382 "base_bdev_name": "malloc2" 00:23:52.382 } 00:23:52.382 } 00:23:52.382 }' 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:52.382 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:52.382 13:52:06 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:52.643 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:52.643 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:52.643 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:52.643 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:52.643 13:52:06 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:52.643 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:52.643 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:52.903 [2024-06-10 13:52:07.176148] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:52.903 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 00:23:52.903 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 ']' 00:23:52.903 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:53.163 [2024-06-10 13:52:07.380481] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:53.163 [2024-06-10 13:52:07.380491] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:53.163 [2024-06-10 13:52:07.380533] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:53.163 [2024-06-10 
13:52:07.380577] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:53.163 [2024-06-10 13:52:07.380583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2239c90 name raid_bdev1, state offline 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:53.163 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:53.423 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:53.423 13:52:07 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:53.684 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:53.684 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:53.945 13:52:08 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:53.945 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:53.945 [2024-06-10 13:52:08.411057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:53.945 [2024-06-10 13:52:08.412207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:53.945 [2024-06-10 13:52:08.412251] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:53.945 [2024-06-10 13:52:08.412279] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:53.945 [2024-06-10 13:52:08.412290] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:53.945 [2024-06-10 13:52:08.412295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2241a90 name raid_bdev1, state configuring 00:23:53.945 request: 00:23:53.945 { 00:23:53.945 "name": "raid_bdev1", 00:23:53.945 "raid_level": "raid1", 00:23:53.945 "base_bdevs": [ 00:23:53.945 "malloc1", 00:23:53.945 "malloc2" 00:23:53.945 ], 00:23:53.945 "superblock": false, 00:23:53.945 "method": "bdev_raid_create", 00:23:53.945 "req_id": 1 00:23:53.945 } 00:23:53.945 Got JSON-RPC error response 00:23:53.945 response: 00:23:53.945 { 00:23:53.945 "code": -17, 00:23:53.945 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:53.945 } 00:23:54.205 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # es=1 00:23:54.205 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:54.206 13:52:08 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:54.206 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:54.466 [2024-06-10 13:52:08.820045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:54.466 [2024-06-10 13:52:08.820068] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.466 [2024-06-10 13:52:08.820077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2238d20 00:23:54.466 [2024-06-10 13:52:08.820084] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.466 [2024-06-10 13:52:08.821286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.466 [2024-06-10 13:52:08.821305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:54.466 [2024-06-10 13:52:08.821334] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:54.466 [2024-06-10 13:52:08.821350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:54.466 pt1 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.466 13:52:08 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.726 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.726 "name": "raid_bdev1", 00:23:54.726 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:54.726 "strip_size_kb": 0, 00:23:54.726 "state": "configuring", 00:23:54.726 "raid_level": "raid1", 00:23:54.726 "superblock": true, 00:23:54.726 "num_base_bdevs": 2, 00:23:54.726 "num_base_bdevs_discovered": 1, 00:23:54.726 "num_base_bdevs_operational": 2, 00:23:54.726 "base_bdevs_list": [ 00:23:54.726 { 00:23:54.726 "name": "pt1", 00:23:54.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:54.726 "is_configured": true, 00:23:54.726 
"data_offset": 256, 00:23:54.726 "data_size": 7936 00:23:54.726 }, 00:23:54.726 { 00:23:54.726 "name": null, 00:23:54.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:54.726 "is_configured": false, 00:23:54.726 "data_offset": 256, 00:23:54.726 "data_size": 7936 00:23:54.726 } 00:23:54.726 ] 00:23:54.726 }' 00:23:54.726 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.726 13:52:09 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:55.297 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:23:55.297 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:55.297 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.297 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:55.558 [2024-06-10 13:52:09.794532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:55.558 [2024-06-10 13:52:09.794560] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.558 [2024-06-10 13:52:09.794571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2238560 00:23:55.558 [2024-06-10 13:52:09.794577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.558 [2024-06-10 13:52:09.794720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.558 [2024-06-10 13:52:09.794729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:55.558 [2024-06-10 13:52:09.794757] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:55.558 
[2024-06-10 13:52:09.794769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:55.558 [2024-06-10 13:52:09.794843] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x223c7b0 00:23:55.558 [2024-06-10 13:52:09.794849] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:55.558 [2024-06-10 13:52:09.794894] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2238950 00:23:55.558 [2024-06-10 13:52:09.794977] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223c7b0 00:23:55.558 [2024-06-10 13:52:09.794983] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x223c7b0 00:23:55.558 [2024-06-10 13:52:09.795039] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:55.558 pt2 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.558 13:52:09 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.558 13:52:09 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.558 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.558 "name": "raid_bdev1", 00:23:55.558 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:55.558 "strip_size_kb": 0, 00:23:55.558 "state": "online", 00:23:55.558 "raid_level": "raid1", 00:23:55.558 "superblock": true, 00:23:55.558 "num_base_bdevs": 2, 00:23:55.558 "num_base_bdevs_discovered": 2, 00:23:55.558 "num_base_bdevs_operational": 2, 00:23:55.558 "base_bdevs_list": [ 00:23:55.558 { 00:23:55.558 "name": "pt1", 00:23:55.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:55.558 "is_configured": true, 00:23:55.558 "data_offset": 256, 00:23:55.558 "data_size": 7936 00:23:55.558 }, 00:23:55.558 { 00:23:55.558 "name": "pt2", 00:23:55.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:55.558 "is_configured": true, 00:23:55.558 "data_offset": 256, 00:23:55.558 "data_size": 7936 00:23:55.558 } 00:23:55.558 ] 00:23:55.558 }' 00:23:55.558 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.558 13:52:10 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 
00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:56.500 [2024-06-10 13:52:10.797272] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:56.500 "name": "raid_bdev1", 00:23:56.500 "aliases": [ 00:23:56.500 "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9" 00:23:56.500 ], 00:23:56.500 "product_name": "Raid Volume", 00:23:56.500 "block_size": 4096, 00:23:56.500 "num_blocks": 7936, 00:23:56.500 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:56.500 "md_size": 32, 00:23:56.500 "md_interleave": false, 00:23:56.500 "dif_type": 0, 00:23:56.500 "assigned_rate_limits": { 00:23:56.500 "rw_ios_per_sec": 0, 00:23:56.500 "rw_mbytes_per_sec": 0, 00:23:56.500 "r_mbytes_per_sec": 0, 00:23:56.500 "w_mbytes_per_sec": 0 00:23:56.500 }, 00:23:56.500 "claimed": false, 00:23:56.500 "zoned": false, 00:23:56.500 "supported_io_types": { 00:23:56.500 "read": true, 00:23:56.500 "write": true, 00:23:56.500 "unmap": false, 00:23:56.500 "write_zeroes": true, 00:23:56.500 "flush": false, 00:23:56.500 "reset": true, 
00:23:56.500 "compare": false, 00:23:56.500 "compare_and_write": false, 00:23:56.500 "abort": false, 00:23:56.500 "nvme_admin": false, 00:23:56.500 "nvme_io": false 00:23:56.500 }, 00:23:56.500 "memory_domains": [ 00:23:56.500 { 00:23:56.500 "dma_device_id": "system", 00:23:56.500 "dma_device_type": 1 00:23:56.500 }, 00:23:56.500 { 00:23:56.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.500 "dma_device_type": 2 00:23:56.500 }, 00:23:56.500 { 00:23:56.500 "dma_device_id": "system", 00:23:56.500 "dma_device_type": 1 00:23:56.500 }, 00:23:56.500 { 00:23:56.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.500 "dma_device_type": 2 00:23:56.500 } 00:23:56.500 ], 00:23:56.500 "driver_specific": { 00:23:56.500 "raid": { 00:23:56.500 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:56.500 "strip_size_kb": 0, 00:23:56.500 "state": "online", 00:23:56.500 "raid_level": "raid1", 00:23:56.500 "superblock": true, 00:23:56.500 "num_base_bdevs": 2, 00:23:56.500 "num_base_bdevs_discovered": 2, 00:23:56.500 "num_base_bdevs_operational": 2, 00:23:56.500 "base_bdevs_list": [ 00:23:56.500 { 00:23:56.500 "name": "pt1", 00:23:56.500 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:56.500 "is_configured": true, 00:23:56.500 "data_offset": 256, 00:23:56.500 "data_size": 7936 00:23:56.500 }, 00:23:56.500 { 00:23:56.500 "name": "pt2", 00:23:56.500 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:56.500 "is_configured": true, 00:23:56.500 "data_offset": 256, 00:23:56.500 "data_size": 7936 00:23:56.500 } 00:23:56.500 ] 00:23:56.500 } 00:23:56.500 } 00:23:56.500 }' 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:56.500 pt2' 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # 
for name in $base_bdev_names 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:56.500 13:52:10 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:56.761 "name": "pt1", 00:23:56.761 "aliases": [ 00:23:56.761 "00000000-0000-0000-0000-000000000001" 00:23:56.761 ], 00:23:56.761 "product_name": "passthru", 00:23:56.761 "block_size": 4096, 00:23:56.761 "num_blocks": 8192, 00:23:56.761 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:56.761 "md_size": 32, 00:23:56.761 "md_interleave": false, 00:23:56.761 "dif_type": 0, 00:23:56.761 "assigned_rate_limits": { 00:23:56.761 "rw_ios_per_sec": 0, 00:23:56.761 "rw_mbytes_per_sec": 0, 00:23:56.761 "r_mbytes_per_sec": 0, 00:23:56.761 "w_mbytes_per_sec": 0 00:23:56.761 }, 00:23:56.761 "claimed": true, 00:23:56.761 "claim_type": "exclusive_write", 00:23:56.761 "zoned": false, 00:23:56.761 "supported_io_types": { 00:23:56.761 "read": true, 00:23:56.761 "write": true, 00:23:56.761 "unmap": true, 00:23:56.761 "write_zeroes": true, 00:23:56.761 "flush": true, 00:23:56.761 "reset": true, 00:23:56.761 "compare": false, 00:23:56.761 "compare_and_write": false, 00:23:56.761 "abort": true, 00:23:56.761 "nvme_admin": false, 00:23:56.761 "nvme_io": false 00:23:56.761 }, 00:23:56.761 "memory_domains": [ 00:23:56.761 { 00:23:56.761 "dma_device_id": "system", 00:23:56.761 "dma_device_type": 1 00:23:56.761 }, 00:23:56.761 { 00:23:56.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.761 "dma_device_type": 2 00:23:56.761 } 00:23:56.761 ], 00:23:56.761 "driver_specific": { 00:23:56.761 "passthru": { 00:23:56.761 "name": "pt1", 00:23:56.761 "base_bdev_name": "malloc1" 00:23:56.761 } 00:23:56.761 } 00:23:56.761 }' 00:23:56.761 13:52:11 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:56.761 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:57.021 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:57.283 "name": "pt2", 00:23:57.283 "aliases": [ 00:23:57.283 "00000000-0000-0000-0000-000000000002" 00:23:57.283 ], 00:23:57.283 
"product_name": "passthru", 00:23:57.283 "block_size": 4096, 00:23:57.283 "num_blocks": 8192, 00:23:57.283 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:57.283 "md_size": 32, 00:23:57.283 "md_interleave": false, 00:23:57.283 "dif_type": 0, 00:23:57.283 "assigned_rate_limits": { 00:23:57.283 "rw_ios_per_sec": 0, 00:23:57.283 "rw_mbytes_per_sec": 0, 00:23:57.283 "r_mbytes_per_sec": 0, 00:23:57.283 "w_mbytes_per_sec": 0 00:23:57.283 }, 00:23:57.283 "claimed": true, 00:23:57.283 "claim_type": "exclusive_write", 00:23:57.283 "zoned": false, 00:23:57.283 "supported_io_types": { 00:23:57.283 "read": true, 00:23:57.283 "write": true, 00:23:57.283 "unmap": true, 00:23:57.283 "write_zeroes": true, 00:23:57.283 "flush": true, 00:23:57.283 "reset": true, 00:23:57.283 "compare": false, 00:23:57.283 "compare_and_write": false, 00:23:57.283 "abort": true, 00:23:57.283 "nvme_admin": false, 00:23:57.283 "nvme_io": false 00:23:57.283 }, 00:23:57.283 "memory_domains": [ 00:23:57.283 { 00:23:57.283 "dma_device_id": "system", 00:23:57.283 "dma_device_type": 1 00:23:57.283 }, 00:23:57.283 { 00:23:57.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:57.283 "dma_device_type": 2 00:23:57.283 } 00:23:57.283 ], 00:23:57.283 "driver_specific": { 00:23:57.283 "passthru": { 00:23:57.283 "name": "pt2", 00:23:57.283 "base_bdev_name": "malloc2" 00:23:57.283 } 00:23:57.283 } 00:23:57.283 }' 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.283 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.544 13:52:11 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:57.544 13:52:11 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:57.804 [2024-06-10 13:52:12.164742] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:57.804 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 '!=' d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 ']' 00:23:57.804 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:57.804 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:57.804 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:23:57.804 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:58.065 [2024-06-10 13:52:12.369094] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.065 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.325 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.325 "name": "raid_bdev1", 00:23:58.325 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:58.325 "strip_size_kb": 0, 00:23:58.325 "state": "online", 00:23:58.325 "raid_level": "raid1", 00:23:58.325 "superblock": true, 00:23:58.325 
"num_base_bdevs": 2, 00:23:58.325 "num_base_bdevs_discovered": 1, 00:23:58.325 "num_base_bdevs_operational": 1, 00:23:58.325 "base_bdevs_list": [ 00:23:58.325 { 00:23:58.325 "name": null, 00:23:58.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.325 "is_configured": false, 00:23:58.325 "data_offset": 256, 00:23:58.325 "data_size": 7936 00:23:58.325 }, 00:23:58.325 { 00:23:58.325 "name": "pt2", 00:23:58.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:58.325 "is_configured": true, 00:23:58.325 "data_offset": 256, 00:23:58.325 "data_size": 7936 00:23:58.325 } 00:23:58.325 ] 00:23:58.325 }' 00:23:58.325 13:52:12 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.325 13:52:12 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:58.897 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:58.897 [2024-06-10 13:52:13.307457] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:58.897 [2024-06-10 13:52:13.307472] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:58.897 [2024-06-10 13:52:13.307505] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:58.897 [2024-06-10 13:52:13.307540] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:58.897 [2024-06-10 13:52:13.307546] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223c7b0 name raid_bdev1, state offline 00:23:58.897 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.897 13:52:13 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:59.157 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:59.157 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:59.157 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:59.157 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:59.157 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:23:59.417 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:59.677 [2024-06-10 13:52:13.916976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:59.677 [2024-06-10 13:52:13.917002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.677 [2024-06-10 13:52:13.917012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x223d560 00:23:59.677 [2024-06-10 13:52:13.917019] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.677 
[2024-06-10 13:52:13.918253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.677 [2024-06-10 13:52:13.918272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:59.677 [2024-06-10 13:52:13.918305] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:59.677 [2024-06-10 13:52:13.918323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:59.677 [2024-06-10 13:52:13.918382] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x223cef0 00:23:59.677 [2024-06-10 13:52:13.918387] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:59.677 [2024-06-10 13:52:13.918430] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223cd50 00:23:59.677 [2024-06-10 13:52:13.918510] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223cef0 00:23:59.677 [2024-06-10 13:52:13.918516] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x223cef0 00:23:59.677 [2024-06-10 13:52:13.918566] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:59.677 pt2 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.677 13:52:13 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.677 13:52:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.677 "name": "raid_bdev1", 00:23:59.677 "uuid": "d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:23:59.677 "strip_size_kb": 0, 00:23:59.677 "state": "online", 00:23:59.677 "raid_level": "raid1", 00:23:59.677 "superblock": true, 00:23:59.677 "num_base_bdevs": 2, 00:23:59.677 "num_base_bdevs_discovered": 1, 00:23:59.677 "num_base_bdevs_operational": 1, 00:23:59.677 "base_bdevs_list": [ 00:23:59.677 { 00:23:59.677 "name": null, 00:23:59.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.677 "is_configured": false, 00:23:59.677 "data_offset": 256, 00:23:59.677 "data_size": 7936 00:23:59.677 }, 00:23:59.677 { 00:23:59.677 "name": "pt2", 00:23:59.677 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:59.677 "is_configured": true, 00:23:59.677 "data_offset": 256, 00:23:59.677 "data_size": 7936 00:23:59.677 } 00:23:59.677 ] 00:23:59.677 }' 00:23:59.677 13:52:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.677 13:52:14 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@10 -- # set +x 00:24:00.246 13:52:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:00.505 [2024-06-10 13:52:14.887424] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:00.505 [2024-06-10 13:52:14.887439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:00.505 [2024-06-10 13:52:14.887476] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.505 [2024-06-10 13:52:14.887511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.505 [2024-06-10 13:52:14.887517] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223cef0 name raid_bdev1, state offline 00:24:00.505 13:52:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.505 13:52:14 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:00.765 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:00.765 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:00.765 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:24:00.765 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:01.026 [2024-06-10 13:52:15.296450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:01.026 [2024-06-10 13:52:15.296477] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:01.026 [2024-06-10 13:52:15.296487] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x223b8f0 00:24:01.026 [2024-06-10 13:52:15.296494] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:01.026 [2024-06-10 13:52:15.297723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:01.026 [2024-06-10 13:52:15.297742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:01.026 [2024-06-10 13:52:15.297775] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:01.026 [2024-06-10 13:52:15.297791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:01.026 [2024-06-10 13:52:15.297861] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:01.026 [2024-06-10 13:52:15.297868] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:01.026 [2024-06-10 13:52:15.297877] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2240170 name raid_bdev1, state configuring 00:24:01.026 [2024-06-10 13:52:15.297892] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:01.026 [2024-06-10 13:52:15.297931] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2240d50 00:24:01.026 [2024-06-10 13:52:15.297937] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:01.026 [2024-06-10 13:52:15.297980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223ada0 00:24:01.026 [2024-06-10 13:52:15.298059] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2240d50 00:24:01.026 [2024-06-10 13:52:15.298064] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x2240d50 00:24:01.026 [2024-06-10 13:52:15.298125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.026 pt1 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.026 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.286 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.286 "name": "raid_bdev1", 00:24:01.286 "uuid": 
"d4f50ee6-7eae-42b5-bd79-ca81dc224ea9", 00:24:01.286 "strip_size_kb": 0, 00:24:01.286 "state": "online", 00:24:01.286 "raid_level": "raid1", 00:24:01.286 "superblock": true, 00:24:01.286 "num_base_bdevs": 2, 00:24:01.286 "num_base_bdevs_discovered": 1, 00:24:01.286 "num_base_bdevs_operational": 1, 00:24:01.286 "base_bdevs_list": [ 00:24:01.286 { 00:24:01.286 "name": null, 00:24:01.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.286 "is_configured": false, 00:24:01.286 "data_offset": 256, 00:24:01.286 "data_size": 7936 00:24:01.286 }, 00:24:01.286 { 00:24:01.286 "name": "pt2", 00:24:01.286 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:01.286 "is_configured": true, 00:24:01.286 "data_offset": 256, 00:24:01.286 "data_size": 7936 00:24:01.286 } 00:24:01.286 ] 00:24:01.286 }' 00:24:01.286 13:52:15 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.286 13:52:15 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:01.856 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:01.856 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:01.856 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:01.856 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:01.856 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:02.116 [2024-06-10 13:52:16.459576] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:02.116 13:52:16 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 '!=' d4f50ee6-7eae-42b5-bd79-ca81dc224ea9 ']' 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1676337 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1676337 ']' 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # kill -0 1676337 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # uname 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1676337 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1676337' 00:24:02.116 killing process with pid 1676337 00:24:02.116 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # kill 1676337 00:24:02.116 [2024-06-10 13:52:16.531951] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:02.116 [2024-06-10 13:52:16.531992] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:02.116 [2024-06-10 13:52:16.532024] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:02.116 [2024-06-10 13:52:16.532031] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2240d50 name raid_bdev1, state offline 00:24:02.116 13:52:16 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@973 -- # wait 1676337 00:24:02.116 [2024-06-10 13:52:16.544944] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:02.376 13:52:16 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:24:02.376 00:24:02.376 real 0m13.672s 00:24:02.376 user 0m25.320s 00:24:02.376 sys 0m2.085s 00:24:02.376 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:02.376 13:52:16 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:02.376 ************************************ 00:24:02.376 END TEST raid_superblock_test_md_separate 00:24:02.376 ************************************ 00:24:02.376 13:52:16 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:24:02.376 13:52:16 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:24:02.376 13:52:16 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:02.376 13:52:16 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:02.376 13:52:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:02.376 ************************************ 00:24:02.376 START TEST raid_rebuild_test_sb_md_separate 00:24:02.376 ************************************ 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 
-- # local background_io=false 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 
00:24:02.376 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1679259 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1679259 /var/tmp/spdk-raid.sock 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 1679259 ']' 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:02.377 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:02.377 13:52:16 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:02.377 [2024-06-10 13:52:16.810724] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:24:02.377 [2024-06-10 13:52:16.810784] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1679259 ] 00:24:02.377 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:02.377 Zero copy mechanism will not be used. 00:24:02.636 [2024-06-10 13:52:16.903193] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:02.636 [2024-06-10 13:52:16.972242] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:24:02.636 [2024-06-10 13:52:17.025535] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:02.636 [2024-06-10 13:52:17.025560] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:03.206 13:52:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:03.206 13:52:17 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:24:03.206 13:52:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:03.206 13:52:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:24:03.466 BaseBdev1_malloc 00:24:03.466 13:52:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:03.726 [2024-06-10 13:52:18.057189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:03.726 [2024-06-10 13:52:18.057225] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.726 [2024-06-10 13:52:18.057244] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b43b0 00:24:03.726 [2024-06-10 13:52:18.057251] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.726 [2024-06-10 13:52:18.058474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.726 [2024-06-10 13:52:18.058494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:03.726 BaseBdev1 00:24:03.726 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:03.726 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:24:03.986 BaseBdev2_malloc 00:24:03.986 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:04.245 [2024-06-10 13:52:18.464832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:04.245 [2024-06-10 13:52:18.464859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.245 [2024-06-10 13:52:18.464873] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb0b220 00:24:04.245 [2024-06-10 13:52:18.464879] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.245 [2024-06-10 13:52:18.466014] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.245 [2024-06-10 13:52:18.466033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:04.245 BaseBdev2 00:24:04.245 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:24:04.245 spare_malloc 00:24:04.245 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:04.505 spare_delay 00:24:04.505 13:52:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:04.765 [2024-06-10 13:52:19.044794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:04.765 [2024-06-10 13:52:19.044824] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.765 [2024-06-10 13:52:19.044838] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaed260 00:24:04.765 [2024-06-10 13:52:19.044845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.765 [2024-06-10 13:52:19.046020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.765 [2024-06-10 13:52:19.046039] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:04.765 spare 00:24:04.765 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:05.025 [2024-06-10 13:52:19.245316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:05.025 [2024-06-10 13:52:19.246369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:05.026 [2024-06-10 13:52:19.246497] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xaedcc0 00:24:05.026 [2024-06-10 13:52:19.246505] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:05.026 [2024-06-10 13:52:19.246560] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b2520 00:24:05.026 [2024-06-10 13:52:19.246651] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xaedcc0 00:24:05.026 [2024-06-10 13:52:19.246657] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xaedcc0 00:24:05.026 [2024-06-10 13:52:19.246713] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.026 "name": "raid_bdev1", 00:24:05.026 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:05.026 "strip_size_kb": 0, 00:24:05.026 "state": "online", 00:24:05.026 "raid_level": "raid1", 00:24:05.026 "superblock": true, 00:24:05.026 "num_base_bdevs": 2, 00:24:05.026 "num_base_bdevs_discovered": 2, 00:24:05.026 "num_base_bdevs_operational": 2, 00:24:05.026 "base_bdevs_list": [ 00:24:05.026 { 00:24:05.026 "name": "BaseBdev1", 00:24:05.026 "uuid": "bf8accd2-7548-522a-9052-7c578d2719f4", 00:24:05.026 "is_configured": true, 00:24:05.026 "data_offset": 256, 00:24:05.026 "data_size": 7936 00:24:05.026 }, 00:24:05.026 { 00:24:05.026 "name": "BaseBdev2", 00:24:05.026 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:05.026 "is_configured": true, 00:24:05.026 "data_offset": 256, 00:24:05.026 "data_size": 7936 00:24:05.026 } 00:24:05.026 ] 00:24:05.026 }' 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.026 13:52:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:05.595 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:05.595 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:05.859 [2024-06-10 13:52:20.195935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:05.859 13:52:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:24:05.859 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.859 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:06.141 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:06.141 13:52:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:06.450 [2024-06-10 13:52:20.608799] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b33a0 00:24:06.450 /dev/nbd0 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:06.450 1+0 records in 00:24:06.450 1+0 records out 00:24:06.450 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282025 s, 14.5 MB/s 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:06.450 13:52:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:24:07.021 7936+0 records in 00:24:07.021 7936+0 records out 00:24:07.021 32505856 bytes (33 MB, 31 MiB) copied, 0.570233 s, 57.0 MB/s 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:07.021 13:52:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:07.021 [2024-06-10 13:52:21.459345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:07.021 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:07.282 [2024-06-10 13:52:21.635833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.282 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.542 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.542 "name": "raid_bdev1", 00:24:07.542 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:07.542 "strip_size_kb": 0, 00:24:07.542 "state": "online", 00:24:07.542 "raid_level": "raid1", 00:24:07.542 "superblock": true, 00:24:07.542 "num_base_bdevs": 2, 00:24:07.542 "num_base_bdevs_discovered": 1, 00:24:07.542 "num_base_bdevs_operational": 1, 00:24:07.542 "base_bdevs_list": [ 00:24:07.542 { 00:24:07.542 "name": null, 00:24:07.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.542 "is_configured": false, 00:24:07.542 "data_offset": 256, 00:24:07.542 "data_size": 7936 00:24:07.542 }, 00:24:07.542 { 00:24:07.542 "name": "BaseBdev2", 
00:24:07.542 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:07.542 "is_configured": true, 00:24:07.542 "data_offset": 256, 00:24:07.542 "data_size": 7936 00:24:07.542 } 00:24:07.542 ] 00:24:07.542 }' 00:24:07.542 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.542 13:52:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:08.114 13:52:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:08.114 [2024-06-10 13:52:22.578229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:08.114 [2024-06-10 13:52:22.579910] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b2f80 00:24:08.114 [2024-06-10 13:52:22.581640] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:08.375 13:52:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.317 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.317 13:52:23 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.579 "name": "raid_bdev1", 00:24:09.579 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:09.579 "strip_size_kb": 0, 00:24:09.579 "state": "online", 00:24:09.579 "raid_level": "raid1", 00:24:09.579 "superblock": true, 00:24:09.579 "num_base_bdevs": 2, 00:24:09.579 "num_base_bdevs_discovered": 2, 00:24:09.579 "num_base_bdevs_operational": 2, 00:24:09.579 "process": { 00:24:09.579 "type": "rebuild", 00:24:09.579 "target": "spare", 00:24:09.579 "progress": { 00:24:09.579 "blocks": 2816, 00:24:09.579 "percent": 35 00:24:09.579 } 00:24:09.579 }, 00:24:09.579 "base_bdevs_list": [ 00:24:09.579 { 00:24:09.579 "name": "spare", 00:24:09.579 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:09.579 "is_configured": true, 00:24:09.579 "data_offset": 256, 00:24:09.579 "data_size": 7936 00:24:09.579 }, 00:24:09.579 { 00:24:09.579 "name": "BaseBdev2", 00:24:09.579 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:09.579 "is_configured": true, 00:24:09.579 "data_offset": 256, 00:24:09.579 "data_size": 7936 00:24:09.579 } 00:24:09.579 ] 00:24:09.579 }' 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.579 13:52:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:24:09.840 [2024-06-10 13:52:24.078750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:09.840 [2024-06-10 13:52:24.090764] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:09.840 [2024-06-10 13:52:24.090796] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:09.840 [2024-06-10 13:52:24.090806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:09.840 [2024-06-10 13:52:24.090811] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.840 "name": "raid_bdev1", 00:24:09.840 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:09.840 "strip_size_kb": 0, 00:24:09.840 "state": "online", 00:24:09.840 "raid_level": "raid1", 00:24:09.840 "superblock": true, 00:24:09.840 "num_base_bdevs": 2, 00:24:09.840 "num_base_bdevs_discovered": 1, 00:24:09.840 "num_base_bdevs_operational": 1, 00:24:09.840 "base_bdevs_list": [ 00:24:09.840 { 00:24:09.840 "name": null, 00:24:09.840 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.840 "is_configured": false, 00:24:09.840 "data_offset": 256, 00:24:09.840 "data_size": 7936 00:24:09.840 }, 00:24:09.840 { 00:24:09.840 "name": "BaseBdev2", 00:24:09.840 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:09.840 "is_configured": true, 00:24:09.840 "data_offset": 256, 00:24:09.840 "data_size": 7936 00:24:09.840 } 00:24:09.840 ] 00:24:09.840 }' 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.840 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:10.410 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:10.411 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.411 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:10.411 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:10.411 13:52:24 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.411 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.411 13:52:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.671 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.671 "name": "raid_bdev1", 00:24:10.671 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:10.671 "strip_size_kb": 0, 00:24:10.671 "state": "online", 00:24:10.671 "raid_level": "raid1", 00:24:10.671 "superblock": true, 00:24:10.671 "num_base_bdevs": 2, 00:24:10.671 "num_base_bdevs_discovered": 1, 00:24:10.671 "num_base_bdevs_operational": 1, 00:24:10.671 "base_bdevs_list": [ 00:24:10.671 { 00:24:10.671 "name": null, 00:24:10.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.671 "is_configured": false, 00:24:10.671 "data_offset": 256, 00:24:10.671 "data_size": 7936 00:24:10.671 }, 00:24:10.671 { 00:24:10.671 "name": "BaseBdev2", 00:24:10.671 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:10.671 "is_configured": true, 00:24:10.671 "data_offset": 256, 00:24:10.671 "data_size": 7936 00:24:10.671 } 00:24:10.671 ] 00:24:10.671 }' 00:24:10.671 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.671 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:10.671 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.931 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:10.931 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:10.931 [2024-06-10 13:52:25.341956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:10.931 [2024-06-10 13:52:25.343643] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b2f80 00:24:10.931 [2024-06-10 13:52:25.344860] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:10.931 13:52:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.311 "name": "raid_bdev1", 00:24:12.311 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:12.311 "strip_size_kb": 0, 00:24:12.311 "state": "online", 00:24:12.311 "raid_level": "raid1", 00:24:12.311 "superblock": true, 00:24:12.311 "num_base_bdevs": 2, 
00:24:12.311 "num_base_bdevs_discovered": 2, 00:24:12.311 "num_base_bdevs_operational": 2, 00:24:12.311 "process": { 00:24:12.311 "type": "rebuild", 00:24:12.311 "target": "spare", 00:24:12.311 "progress": { 00:24:12.311 "blocks": 2816, 00:24:12.311 "percent": 35 00:24:12.311 } 00:24:12.311 }, 00:24:12.311 "base_bdevs_list": [ 00:24:12.311 { 00:24:12.311 "name": "spare", 00:24:12.311 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:12.311 "is_configured": true, 00:24:12.311 "data_offset": 256, 00:24:12.311 "data_size": 7936 00:24:12.311 }, 00:24:12.311 { 00:24:12.311 "name": "BaseBdev2", 00:24:12.311 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:12.311 "is_configured": true, 00:24:12.311 "data_offset": 256, 00:24:12.311 "data_size": 7936 00:24:12.311 } 00:24:12.311 ] 00:24:12.311 }' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:12.311 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=934 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.311 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.312 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:12.572 "name": "raid_bdev1", 00:24:12.572 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:12.572 "strip_size_kb": 0, 00:24:12.572 "state": "online", 00:24:12.572 "raid_level": "raid1", 00:24:12.572 "superblock": true, 00:24:12.572 "num_base_bdevs": 2, 00:24:12.572 "num_base_bdevs_discovered": 2, 00:24:12.572 "num_base_bdevs_operational": 2, 00:24:12.572 "process": { 00:24:12.572 "type": "rebuild", 00:24:12.572 "target": "spare", 00:24:12.572 "progress": { 00:24:12.572 "blocks": 3584, 00:24:12.572 "percent": 45 00:24:12.572 } 00:24:12.572 }, 00:24:12.572 "base_bdevs_list": [ 00:24:12.572 { 00:24:12.572 "name": "spare", 00:24:12.572 "uuid": 
"fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:12.572 "is_configured": true, 00:24:12.572 "data_offset": 256, 00:24:12.572 "data_size": 7936 00:24:12.572 }, 00:24:12.572 { 00:24:12.572 "name": "BaseBdev2", 00:24:12.572 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:12.572 "is_configured": true, 00:24:12.572 "data_offset": 256, 00:24:12.572 "data_size": 7936 00:24:12.572 } 00:24:12.572 ] 00:24:12.572 }' 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.572 13:52:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.511 13:52:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.511 13:52:27 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.771 "name": "raid_bdev1", 00:24:13.771 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:13.771 "strip_size_kb": 0, 00:24:13.771 "state": "online", 00:24:13.771 "raid_level": "raid1", 00:24:13.771 "superblock": true, 00:24:13.771 "num_base_bdevs": 2, 00:24:13.771 "num_base_bdevs_discovered": 2, 00:24:13.771 "num_base_bdevs_operational": 2, 00:24:13.771 "process": { 00:24:13.771 "type": "rebuild", 00:24:13.771 "target": "spare", 00:24:13.771 "progress": { 00:24:13.771 "blocks": 6912, 00:24:13.771 "percent": 87 00:24:13.771 } 00:24:13.771 }, 00:24:13.771 "base_bdevs_list": [ 00:24:13.771 { 00:24:13.771 "name": "spare", 00:24:13.771 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:13.771 "is_configured": true, 00:24:13.771 "data_offset": 256, 00:24:13.771 "data_size": 7936 00:24:13.771 }, 00:24:13.771 { 00:24:13.771 "name": "BaseBdev2", 00:24:13.771 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:13.771 "is_configured": true, 00:24:13.771 "data_offset": 256, 00:24:13.771 "data_size": 7936 00:24:13.771 } 00:24:13.771 ] 00:24:13.771 }' 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:13.771 13:52:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:14.031 [2024-06-10 13:52:28.463504] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:24:14.031 [2024-06-10 13:52:28.463550] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:14.031 [2024-06-10 13:52:28.463626] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.970 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.230 "name": "raid_bdev1", 00:24:15.230 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:15.230 "strip_size_kb": 0, 00:24:15.230 "state": "online", 00:24:15.230 "raid_level": "raid1", 00:24:15.230 "superblock": true, 00:24:15.230 "num_base_bdevs": 2, 00:24:15.230 "num_base_bdevs_discovered": 2, 00:24:15.230 "num_base_bdevs_operational": 2, 00:24:15.230 "base_bdevs_list": [ 00:24:15.230 { 00:24:15.230 "name": "spare", 00:24:15.230 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 
00:24:15.230 "is_configured": true, 00:24:15.230 "data_offset": 256, 00:24:15.230 "data_size": 7936 00:24:15.230 }, 00:24:15.230 { 00:24:15.230 "name": "BaseBdev2", 00:24:15.230 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:15.230 "is_configured": true, 00:24:15.230 "data_offset": 256, 00:24:15.230 "data_size": 7936 00:24:15.230 } 00:24:15.230 ] 00:24:15.230 }' 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.230 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.490 13:52:29 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.490 "name": "raid_bdev1", 00:24:15.490 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:15.490 "strip_size_kb": 0, 00:24:15.490 "state": "online", 00:24:15.490 "raid_level": "raid1", 00:24:15.490 "superblock": true, 00:24:15.490 "num_base_bdevs": 2, 00:24:15.490 "num_base_bdevs_discovered": 2, 00:24:15.490 "num_base_bdevs_operational": 2, 00:24:15.490 "base_bdevs_list": [ 00:24:15.490 { 00:24:15.490 "name": "spare", 00:24:15.490 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:15.490 "is_configured": true, 00:24:15.490 "data_offset": 256, 00:24:15.490 "data_size": 7936 00:24:15.490 }, 00:24:15.490 { 00:24:15.490 "name": "BaseBdev2", 00:24:15.490 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:15.490 "is_configured": true, 00:24:15.490 "data_offset": 256, 00:24:15.490 "data_size": 7936 00:24:15.490 } 00:24:15.490 ] 00:24:15.490 }' 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.490 13:52:29 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.490 13:52:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.750 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.750 "name": "raid_bdev1", 00:24:15.750 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:15.750 "strip_size_kb": 0, 00:24:15.750 "state": "online", 00:24:15.750 "raid_level": "raid1", 00:24:15.750 "superblock": true, 00:24:15.750 "num_base_bdevs": 2, 00:24:15.750 "num_base_bdevs_discovered": 2, 00:24:15.750 "num_base_bdevs_operational": 2, 00:24:15.750 "base_bdevs_list": [ 00:24:15.750 { 00:24:15.750 "name": "spare", 00:24:15.750 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:15.750 "is_configured": true, 00:24:15.750 "data_offset": 256, 00:24:15.750 "data_size": 7936 00:24:15.750 }, 00:24:15.750 { 00:24:15.750 "name": "BaseBdev2", 00:24:15.750 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:15.750 "is_configured": true, 00:24:15.750 "data_offset": 256, 00:24:15.750 "data_size": 7936 00:24:15.750 } 00:24:15.750 ] 
00:24:15.750 }' 00:24:15.750 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.750 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:16.320 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:16.320 [2024-06-10 13:52:30.743282] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:16.320 [2024-06-10 13:52:30.743303] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:16.320 [2024-06-10 13:52:30.743355] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:16.320 [2024-06-10 13:52:30.743405] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:16.320 [2024-06-10 13:52:30.743412] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xaedcc0 name raid_bdev1, state offline 00:24:16.320 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.320 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:16.579 13:52:30 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:16.579 13:52:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:16.838 /dev/nbd0 00:24:16.838 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:16.839 13:52:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:16.839 1+0 records in 00:24:16.839 1+0 records out 00:24:16.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244237 s, 16.8 MB/s 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:16.839 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:17.098 /dev/nbd1 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:17.098 13:52:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:17.098 1+0 records in 00:24:17.098 1+0 records out 00:24:17.098 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000374964 s, 10.9 MB/s 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:17.098 13:52:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:17.098 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:17.358 13:52:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:17.358 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:17.618 13:52:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:17.877 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:17.877 [2024-06-10 13:52:32.343364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:17.877 [2024-06-10 13:52:32.343400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:17.877 [2024-06-10 13:52:32.343413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b2fd0 00:24:17.877 [2024-06-10 13:52:32.343419] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:17.877 [2024-06-10 13:52:32.344709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:17.877 [2024-06-10 13:52:32.344731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:17.877 [2024-06-10 13:52:32.344778] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:17.877 [2024-06-10 13:52:32.344798] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:17.877 [2024-06-10 13:52:32.344874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:17.877 spare 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.136 [2024-06-10 13:52:32.445164] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb4e140 00:24:18.136 [2024-06-10 13:52:32.445173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:24:18.136 [2024-06-10 13:52:32.445228] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaefed0 00:24:18.136 [2024-06-10 13:52:32.445320] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb4e140 00:24:18.136 [2024-06-10 13:52:32.445326] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb4e140 00:24:18.136 [2024-06-10 13:52:32.445382] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:18.136 "name": "raid_bdev1", 00:24:18.136 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:18.136 "strip_size_kb": 0, 00:24:18.136 "state": "online", 00:24:18.136 "raid_level": "raid1", 00:24:18.136 "superblock": true, 00:24:18.136 "num_base_bdevs": 2, 00:24:18.136 
"num_base_bdevs_discovered": 2, 00:24:18.136 "num_base_bdevs_operational": 2, 00:24:18.136 "base_bdevs_list": [ 00:24:18.136 { 00:24:18.136 "name": "spare", 00:24:18.136 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:18.136 "is_configured": true, 00:24:18.136 "data_offset": 256, 00:24:18.136 "data_size": 7936 00:24:18.136 }, 00:24:18.136 { 00:24:18.136 "name": "BaseBdev2", 00:24:18.136 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:18.136 "is_configured": true, 00:24:18.136 "data_offset": 256, 00:24:18.136 "data_size": 7936 00:24:18.136 } 00:24:18.136 ] 00:24:18.136 }' 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:18.136 13:52:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.706 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.965 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.965 "name": "raid_bdev1", 00:24:18.965 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:18.965 
"strip_size_kb": 0, 00:24:18.965 "state": "online", 00:24:18.965 "raid_level": "raid1", 00:24:18.965 "superblock": true, 00:24:18.965 "num_base_bdevs": 2, 00:24:18.965 "num_base_bdevs_discovered": 2, 00:24:18.965 "num_base_bdevs_operational": 2, 00:24:18.966 "base_bdevs_list": [ 00:24:18.966 { 00:24:18.966 "name": "spare", 00:24:18.966 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:18.966 "is_configured": true, 00:24:18.966 "data_offset": 256, 00:24:18.966 "data_size": 7936 00:24:18.966 }, 00:24:18.966 { 00:24:18.966 "name": "BaseBdev2", 00:24:18.966 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:18.966 "is_configured": true, 00:24:18.966 "data_offset": 256, 00:24:18.966 "data_size": 7936 00:24:18.966 } 00:24:18.966 ] 00:24:18.966 }' 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.966 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:19.225 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:19.225 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:19.485 [2024-06-10 13:52:33.799154] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.485 13:52:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.745 13:52:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:19.745 "name": "raid_bdev1", 00:24:19.745 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:19.745 "strip_size_kb": 0, 00:24:19.745 "state": "online", 00:24:19.745 "raid_level": "raid1", 00:24:19.745 "superblock": true, 00:24:19.745 
"num_base_bdevs": 2, 00:24:19.745 "num_base_bdevs_discovered": 1, 00:24:19.745 "num_base_bdevs_operational": 1, 00:24:19.745 "base_bdevs_list": [ 00:24:19.745 { 00:24:19.745 "name": null, 00:24:19.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:19.745 "is_configured": false, 00:24:19.745 "data_offset": 256, 00:24:19.745 "data_size": 7936 00:24:19.745 }, 00:24:19.745 { 00:24:19.745 "name": "BaseBdev2", 00:24:19.745 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:19.745 "is_configured": true, 00:24:19.745 "data_offset": 256, 00:24:19.745 "data_size": 7936 00:24:19.745 } 00:24:19.745 ] 00:24:19.745 }' 00:24:19.745 13:52:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:19.745 13:52:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:20.314 13:52:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:20.314 [2024-06-10 13:52:34.741558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.314 [2024-06-10 13:52:34.741680] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:20.314 [2024-06-10 13:52:34.741691] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:20.314 [2024-06-10 13:52:34.741708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.314 [2024-06-10 13:52:34.743288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaed4f0 00:24:20.314 [2024-06-10 13:52:34.744416] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:20.314 13:52:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:21.697 "name": "raid_bdev1", 00:24:21.697 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:21.697 "strip_size_kb": 0, 00:24:21.697 "state": "online", 00:24:21.697 "raid_level": "raid1", 00:24:21.697 "superblock": true, 00:24:21.697 "num_base_bdevs": 2, 00:24:21.697 "num_base_bdevs_discovered": 2, 00:24:21.697 "num_base_bdevs_operational": 2, 00:24:21.697 "process": { 00:24:21.697 "type": "rebuild", 00:24:21.697 
"target": "spare", 00:24:21.697 "progress": { 00:24:21.697 "blocks": 3072, 00:24:21.697 "percent": 38 00:24:21.697 } 00:24:21.697 }, 00:24:21.697 "base_bdevs_list": [ 00:24:21.697 { 00:24:21.697 "name": "spare", 00:24:21.697 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:21.697 "is_configured": true, 00:24:21.697 "data_offset": 256, 00:24:21.697 "data_size": 7936 00:24:21.697 }, 00:24:21.697 { 00:24:21.697 "name": "BaseBdev2", 00:24:21.697 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:21.697 "is_configured": true, 00:24:21.697 "data_offset": 256, 00:24:21.697 "data_size": 7936 00:24:21.697 } 00:24:21.697 ] 00:24:21.697 }' 00:24:21.697 13:52:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:21.697 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:21.697 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:21.697 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:21.697 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:21.957 [2024-06-10 13:52:36.246099] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.957 [2024-06-10 13:52:36.253510] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:21.957 [2024-06-10 13:52:36.253542] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.957 [2024-06-10 13:52:36.253552] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.957 [2024-06-10 13:52:36.253556] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.957 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.958 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.218 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.218 "name": "raid_bdev1", 00:24:22.218 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:22.218 "strip_size_kb": 0, 00:24:22.218 "state": "online", 00:24:22.218 "raid_level": "raid1", 00:24:22.218 "superblock": true, 00:24:22.218 "num_base_bdevs": 2, 00:24:22.218 "num_base_bdevs_discovered": 1, 
00:24:22.218 "num_base_bdevs_operational": 1, 00:24:22.218 "base_bdevs_list": [ 00:24:22.218 { 00:24:22.218 "name": null, 00:24:22.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.218 "is_configured": false, 00:24:22.218 "data_offset": 256, 00:24:22.218 "data_size": 7936 00:24:22.218 }, 00:24:22.218 { 00:24:22.219 "name": "BaseBdev2", 00:24:22.219 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:22.219 "is_configured": true, 00:24:22.219 "data_offset": 256, 00:24:22.219 "data_size": 7936 00:24:22.219 } 00:24:22.219 ] 00:24:22.219 }' 00:24:22.219 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.219 13:52:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:22.788 13:52:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:22.788 [2024-06-10 13:52:37.193861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:22.788 [2024-06-10 13:52:37.193898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.789 [2024-06-10 13:52:37.193913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9b2d60 00:24:22.789 [2024-06-10 13:52:37.193920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.789 [2024-06-10 13:52:37.194115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.789 [2024-06-10 13:52:37.194126] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:22.789 [2024-06-10 13:52:37.194176] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:22.789 [2024-06-10 13:52:37.194183] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller 
than existing raid bdev raid_bdev1 (5) 00:24:22.789 [2024-06-10 13:52:37.194188] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:22.789 [2024-06-10 13:52:37.194200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:22.789 [2024-06-10 13:52:37.195762] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9b3380 00:24:22.789 [2024-06-10 13:52:37.196906] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:22.789 spare 00:24:22.789 13:52:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:24.171 "name": "raid_bdev1", 00:24:24.171 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:24.171 "strip_size_kb": 0, 00:24:24.171 "state": "online", 00:24:24.171 "raid_level": "raid1", 00:24:24.171 "superblock": true, 
00:24:24.171 "num_base_bdevs": 2, 00:24:24.171 "num_base_bdevs_discovered": 2, 00:24:24.171 "num_base_bdevs_operational": 2, 00:24:24.171 "process": { 00:24:24.171 "type": "rebuild", 00:24:24.171 "target": "spare", 00:24:24.171 "progress": { 00:24:24.171 "blocks": 2816, 00:24:24.171 "percent": 35 00:24:24.171 } 00:24:24.171 }, 00:24:24.171 "base_bdevs_list": [ 00:24:24.171 { 00:24:24.171 "name": "spare", 00:24:24.171 "uuid": "fc6f3be2-7bb1-5bc6-9ecc-a86a2af6c762", 00:24:24.171 "is_configured": true, 00:24:24.171 "data_offset": 256, 00:24:24.171 "data_size": 7936 00:24:24.171 }, 00:24:24.171 { 00:24:24.171 "name": "BaseBdev2", 00:24:24.171 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:24.171 "is_configured": true, 00:24:24.171 "data_offset": 256, 00:24:24.171 "data_size": 7936 00:24:24.171 } 00:24:24.171 ] 00:24:24.171 }' 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:24.171 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:24.431 [2024-06-10 13:52:38.694207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:24.432 [2024-06-10 13:52:38.706001] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:24.432 [2024-06-10 13:52:38.706030] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:24.432 [2024-06-10 13:52:38.706040] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:24.432 [2024-06-10 13:52:38.706046] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.432 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.691 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.691 "name": "raid_bdev1", 00:24:24.691 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 
00:24:24.691 "strip_size_kb": 0, 00:24:24.691 "state": "online", 00:24:24.691 "raid_level": "raid1", 00:24:24.691 "superblock": true, 00:24:24.691 "num_base_bdevs": 2, 00:24:24.691 "num_base_bdevs_discovered": 1, 00:24:24.691 "num_base_bdevs_operational": 1, 00:24:24.691 "base_bdevs_list": [ 00:24:24.691 { 00:24:24.691 "name": null, 00:24:24.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.691 "is_configured": false, 00:24:24.691 "data_offset": 256, 00:24:24.691 "data_size": 7936 00:24:24.691 }, 00:24:24.691 { 00:24:24.691 "name": "BaseBdev2", 00:24:24.691 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:24.691 "is_configured": true, 00:24:24.691 "data_offset": 256, 00:24:24.691 "data_size": 7936 00:24:24.691 } 00:24:24.691 ] 00:24:24.691 }' 00:24:24.691 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.691 13:52:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.261 13:52:39 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.261 "name": "raid_bdev1", 00:24:25.261 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:25.261 "strip_size_kb": 0, 00:24:25.261 "state": "online", 00:24:25.261 "raid_level": "raid1", 00:24:25.261 "superblock": true, 00:24:25.261 "num_base_bdevs": 2, 00:24:25.261 "num_base_bdevs_discovered": 1, 00:24:25.261 "num_base_bdevs_operational": 1, 00:24:25.261 "base_bdevs_list": [ 00:24:25.261 { 00:24:25.261 "name": null, 00:24:25.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.261 "is_configured": false, 00:24:25.261 "data_offset": 256, 00:24:25.261 "data_size": 7936 00:24:25.261 }, 00:24:25.261 { 00:24:25.261 "name": "BaseBdev2", 00:24:25.261 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:25.261 "is_configured": true, 00:24:25.261 "data_offset": 256, 00:24:25.261 "data_size": 7936 00:24:25.261 } 00:24:25.261 ] 00:24:25.261 }' 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:25.261 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.521 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.521 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:25.521 13:52:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:25.780 [2024-06-10 13:52:40.171674] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:24:25.780 [2024-06-10 13:52:40.171715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.780 [2024-06-10 13:52:40.171728] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaf1c80 00:24:25.780 [2024-06-10 13:52:40.171734] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.780 [2024-06-10 13:52:40.171909] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.780 [2024-06-10 13:52:40.171920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:25.780 [2024-06-10 13:52:40.171954] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:25.780 [2024-06-10 13:52:40.171962] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:25.781 [2024-06-10 13:52:40.171967] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:25.781 BaseBdev1 00:24:25.781 13:52:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:26.721 
13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.721 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.981 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.981 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.981 "name": "raid_bdev1", 00:24:26.981 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:26.981 "strip_size_kb": 0, 00:24:26.981 "state": "online", 00:24:26.981 "raid_level": "raid1", 00:24:26.981 "superblock": true, 00:24:26.981 "num_base_bdevs": 2, 00:24:26.981 "num_base_bdevs_discovered": 1, 00:24:26.981 "num_base_bdevs_operational": 1, 00:24:26.981 "base_bdevs_list": [ 00:24:26.981 { 00:24:26.981 "name": null, 00:24:26.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.981 "is_configured": false, 00:24:26.981 "data_offset": 256, 00:24:26.981 "data_size": 7936 00:24:26.981 }, 00:24:26.981 { 00:24:26.981 "name": "BaseBdev2", 00:24:26.981 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:26.981 "is_configured": true, 00:24:26.981 "data_offset": 256, 00:24:26.981 "data_size": 7936 00:24:26.981 } 00:24:26.981 ] 00:24:26.981 }' 00:24:26.981 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.981 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.551 13:52:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.811 "name": "raid_bdev1", 00:24:27.811 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:27.811 "strip_size_kb": 0, 00:24:27.811 "state": "online", 00:24:27.811 "raid_level": "raid1", 00:24:27.811 "superblock": true, 00:24:27.811 "num_base_bdevs": 2, 00:24:27.811 "num_base_bdevs_discovered": 1, 00:24:27.811 "num_base_bdevs_operational": 1, 00:24:27.811 "base_bdevs_list": [ 00:24:27.811 { 00:24:27.811 "name": null, 00:24:27.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.811 "is_configured": false, 00:24:27.811 "data_offset": 256, 00:24:27.811 "data_size": 7936 00:24:27.811 }, 00:24:27.811 { 00:24:27.811 "name": "BaseBdev2", 00:24:27.811 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:27.811 "is_configured": true, 00:24:27.811 "data_offset": 256, 00:24:27.811 "data_size": 7936 00:24:27.811 } 00:24:27.811 ] 00:24:27.811 }' 00:24:27.811 13:52:42 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 
00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:27.811 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:28.071 [2024-06-10 13:52:42.469527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:28.071 [2024-06-10 13:52:42.469624] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:28.071 [2024-06-10 13:52:42.469632] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:28.071 request: 00:24:28.071 { 00:24:28.071 "raid_bdev": "raid_bdev1", 00:24:28.071 "base_bdev": "BaseBdev1", 00:24:28.071 "method": "bdev_raid_add_base_bdev", 00:24:28.071 "req_id": 1 00:24:28.071 } 00:24:28.071 Got JSON-RPC error response 00:24:28.071 response: 00:24:28.071 { 00:24:28.071 "code": -22, 00:24:28.071 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:28.071 } 00:24:28.071 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # es=1 00:24:28.071 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:28.071 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:28.071 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:28.071 13:52:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.452 "name": "raid_bdev1", 00:24:29.452 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:29.452 "strip_size_kb": 0, 00:24:29.452 "state": "online", 00:24:29.452 "raid_level": "raid1", 00:24:29.452 "superblock": true, 00:24:29.452 "num_base_bdevs": 2, 00:24:29.452 "num_base_bdevs_discovered": 1, 
00:24:29.452 "num_base_bdevs_operational": 1, 00:24:29.452 "base_bdevs_list": [ 00:24:29.452 { 00:24:29.452 "name": null, 00:24:29.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.452 "is_configured": false, 00:24:29.452 "data_offset": 256, 00:24:29.452 "data_size": 7936 00:24:29.452 }, 00:24:29.452 { 00:24:29.452 "name": "BaseBdev2", 00:24:29.452 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:29.452 "is_configured": true, 00:24:29.452 "data_offset": 256, 00:24:29.452 "data_size": 7936 00:24:29.452 } 00:24:29.452 ] 00:24:29.452 }' 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.452 13:52:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.022 "name": "raid_bdev1", 00:24:30.022 "uuid": "ce3152ad-a046-4c2b-8e5b-5ae8e2d3f412", 00:24:30.022 "strip_size_kb": 0, 00:24:30.022 
"state": "online", 00:24:30.022 "raid_level": "raid1", 00:24:30.022 "superblock": true, 00:24:30.022 "num_base_bdevs": 2, 00:24:30.022 "num_base_bdevs_discovered": 1, 00:24:30.022 "num_base_bdevs_operational": 1, 00:24:30.022 "base_bdevs_list": [ 00:24:30.022 { 00:24:30.022 "name": null, 00:24:30.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.022 "is_configured": false, 00:24:30.022 "data_offset": 256, 00:24:30.022 "data_size": 7936 00:24:30.022 }, 00:24:30.022 { 00:24:30.022 "name": "BaseBdev2", 00:24:30.022 "uuid": "d344df74-f934-55b2-b5b0-aa4f33ddf1a0", 00:24:30.022 "is_configured": true, 00:24:30.022 "data_offset": 256, 00:24:30.022 "data_size": 7936 00:24:30.022 } 00:24:30.022 ] 00:24:30.022 }' 00:24:30.022 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1679259 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 1679259 ']' 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 1679259 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1679259 00:24:30.282 13:52:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1679259' 00:24:30.282 killing process with pid 1679259 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 1679259 00:24:30.282 Received shutdown signal, test time was about 60.000000 seconds 00:24:30.282 00:24:30.282 Latency(us) 00:24:30.282 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:30.282 =================================================================================================================== 00:24:30.282 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:30.282 [2024-06-10 13:52:44.609993] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:30.282 [2024-06-10 13:52:44.610071] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:30.282 [2024-06-10 13:52:44.610105] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:30.282 [2024-06-10 13:52:44.610112] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb4e140 name raid_bdev1, state offline 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 1679259 00:24:30.282 [2024-06-10 13:52:44.629083] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:24:30.282 00:24:30.282 real 0m28.006s 00:24:30.282 user 0m44.181s 00:24:30.282 sys 0m3.490s 00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:24:30.282 13:52:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:24:30.282 ************************************ 00:24:30.282 END TEST raid_rebuild_test_sb_md_separate 00:24:30.282 ************************************ 00:24:30.542 13:52:44 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:24:30.542 13:52:44 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:24:30.542 13:52:44 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:24:30.542 13:52:44 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:30.542 13:52:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:30.542 ************************************ 00:24:30.542 START TEST raid_state_function_test_sb_md_interleaved 00:24:30.542 ************************************ 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:24:30.542 13:52:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=1685115 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1685115' 00:24:30.542 Process raid pid: 1685115 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1685115 /var/tmp/spdk-raid.sock 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1685115 ']' 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:30.542 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:30.542 13:52:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:30.542 [2024-06-10 13:52:44.893884] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:24:30.542 [2024-06-10 13:52:44.893933] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:30.542 [2024-06-10 13:52:44.981867] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:30.801 [2024-06-10 13:52:45.046906] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:24:30.801 [2024-06-10 13:52:45.088425] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:30.801 [2024-06-10 13:52:45.088446] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:31.370 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:31.370 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:24:31.370 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:31.630 [2024-06-10 13:52:45.936344] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:31.630 [2024-06-10 13:52:45.936373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:31.630 [2024-06-10 13:52:45.936379] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:31.630 [2024-06-10 13:52:45.936386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:31.630 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:31.630 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=Existed_Raid 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.631 13:52:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:31.891 13:52:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:31.891 "name": "Existed_Raid", 00:24:31.891 "uuid": "e86c957d-ca78-41ab-8758-4524f2a8ffa5", 00:24:31.891 "strip_size_kb": 0, 00:24:31.891 "state": "configuring", 00:24:31.891 "raid_level": "raid1", 00:24:31.891 "superblock": true, 00:24:31.891 "num_base_bdevs": 2, 00:24:31.891 "num_base_bdevs_discovered": 0, 00:24:31.891 "num_base_bdevs_operational": 2, 00:24:31.891 "base_bdevs_list": [ 00:24:31.891 { 00:24:31.891 "name": 
"BaseBdev1", 00:24:31.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.891 "is_configured": false, 00:24:31.891 "data_offset": 0, 00:24:31.891 "data_size": 0 00:24:31.891 }, 00:24:31.891 { 00:24:31.891 "name": "BaseBdev2", 00:24:31.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.891 "is_configured": false, 00:24:31.891 "data_offset": 0, 00:24:31.891 "data_size": 0 00:24:31.891 } 00:24:31.891 ] 00:24:31.891 }' 00:24:31.891 13:52:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.891 13:52:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:32.460 13:52:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:32.460 [2024-06-10 13:52:46.906755] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:32.460 [2024-06-10 13:52:46.906773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ed720 name Existed_Raid, state configuring 00:24:32.460 13:52:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:32.720 [2024-06-10 13:52:47.111279] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:24:32.720 [2024-06-10 13:52:47.111294] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:24:32.720 [2024-06-10 13:52:47.111299] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:32.720 [2024-06-10 13:52:47.111305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:32.720 13:52:47 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:24:32.979 [2024-06-10 13:52:47.310642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:32.979 BaseBdev1 00:24:32.979 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:24:32.979 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:24:32.980 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:32.980 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:24:32.980 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:32.980 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:32.980 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:24:33.240 [ 00:24:33.240 { 00:24:33.240 "name": "BaseBdev1", 00:24:33.240 "aliases": [ 00:24:33.240 "0011b752-3c7d-4978-b42a-e44368d87e22" 00:24:33.240 ], 00:24:33.240 "product_name": "Malloc disk", 00:24:33.240 "block_size": 4128, 00:24:33.240 "num_blocks": 8192, 00:24:33.240 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:33.240 "md_size": 32, 00:24:33.240 "md_interleave": 
true, 00:24:33.240 "dif_type": 0, 00:24:33.240 "assigned_rate_limits": { 00:24:33.240 "rw_ios_per_sec": 0, 00:24:33.240 "rw_mbytes_per_sec": 0, 00:24:33.240 "r_mbytes_per_sec": 0, 00:24:33.240 "w_mbytes_per_sec": 0 00:24:33.240 }, 00:24:33.240 "claimed": true, 00:24:33.240 "claim_type": "exclusive_write", 00:24:33.240 "zoned": false, 00:24:33.240 "supported_io_types": { 00:24:33.240 "read": true, 00:24:33.240 "write": true, 00:24:33.240 "unmap": true, 00:24:33.240 "write_zeroes": true, 00:24:33.240 "flush": true, 00:24:33.240 "reset": true, 00:24:33.240 "compare": false, 00:24:33.240 "compare_and_write": false, 00:24:33.240 "abort": true, 00:24:33.240 "nvme_admin": false, 00:24:33.240 "nvme_io": false 00:24:33.240 }, 00:24:33.240 "memory_domains": [ 00:24:33.240 { 00:24:33.240 "dma_device_id": "system", 00:24:33.240 "dma_device_type": 1 00:24:33.240 }, 00:24:33.240 { 00:24:33.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:33.240 "dma_device_type": 2 00:24:33.240 } 00:24:33.240 ], 00:24:33.240 "driver_specific": {} 00:24:33.240 } 00:24:33.240 ] 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.240 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.500 "name": "Existed_Raid", 00:24:33.500 "uuid": "2bc5741a-f50b-4944-95d7-f82a4187dd22", 00:24:33.500 "strip_size_kb": 0, 00:24:33.500 "state": "configuring", 00:24:33.500 "raid_level": "raid1", 00:24:33.500 "superblock": true, 00:24:33.500 "num_base_bdevs": 2, 00:24:33.500 "num_base_bdevs_discovered": 1, 00:24:33.500 "num_base_bdevs_operational": 2, 00:24:33.500 "base_bdevs_list": [ 00:24:33.500 { 00:24:33.500 "name": "BaseBdev1", 00:24:33.500 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:33.500 "is_configured": true, 00:24:33.500 "data_offset": 256, 00:24:33.500 "data_size": 7936 00:24:33.500 }, 00:24:33.500 { 00:24:33.500 "name": "BaseBdev2", 00:24:33.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.500 "is_configured": false, 00:24:33.500 "data_offset": 0, 00:24:33.500 "data_size": 0 00:24:33.500 } 00:24:33.500 ] 00:24:33.500 }' 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:24:33.500 13:52:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:34.071 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:34.334 [2024-06-10 13:52:48.581882] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:34.334 [2024-06-10 13:52:48.581908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ed010 name Existed_Raid, state configuring 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:24:34.334 [2024-06-10 13:52:48.786427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:34.334 [2024-06-10 13:52:48.787633] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:24:34.334 [2024-06-10 13:52:48.787656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:34.334 13:52:48 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.334 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:34.335 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.335 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.335 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.335 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.335 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.639 13:52:48 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:34.639 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.639 "name": "Existed_Raid", 00:24:34.639 "uuid": "6f7e3c0c-ebf7-41c0-81de-f5649af1c281", 00:24:34.639 "strip_size_kb": 0, 00:24:34.639 "state": "configuring", 00:24:34.639 "raid_level": "raid1", 00:24:34.639 "superblock": true, 00:24:34.639 "num_base_bdevs": 2, 00:24:34.639 "num_base_bdevs_discovered": 1, 00:24:34.639 "num_base_bdevs_operational": 2, 00:24:34.639 "base_bdevs_list": [ 00:24:34.639 { 00:24:34.639 "name": "BaseBdev1", 00:24:34.639 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:34.639 "is_configured": true, 00:24:34.639 "data_offset": 256, 00:24:34.639 "data_size": 7936 00:24:34.639 }, 
00:24:34.639 { 00:24:34.639 "name": "BaseBdev2", 00:24:34.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.639 "is_configured": false, 00:24:34.639 "data_offset": 0, 00:24:34.639 "data_size": 0 00:24:34.639 } 00:24:34.639 ] 00:24:34.639 }' 00:24:34.639 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.639 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:35.238 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:24:35.498 [2024-06-10 13:52:49.770071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:35.498 [2024-06-10 13:52:49.770179] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20eee90 00:24:35.498 [2024-06-10 13:52:49.770187] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:35.498 [2024-06-10 13:52:49.770231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ec830 00:24:35.498 [2024-06-10 13:52:49.770290] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20eee90 00:24:35.498 [2024-06-10 13:52:49.770296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20eee90 00:24:35.498 [2024-06-10 13:52:49.770338] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:35.498 BaseBdev2 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:24:35.498 13:52:49 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:24:35.758 [ 00:24:35.758 { 00:24:35.758 "name": "BaseBdev2", 00:24:35.758 "aliases": [ 00:24:35.758 "19980f8b-9289-4f0c-9531-c82b3a066b14" 00:24:35.758 ], 00:24:35.758 "product_name": "Malloc disk", 00:24:35.758 "block_size": 4128, 00:24:35.758 "num_blocks": 8192, 00:24:35.758 "uuid": "19980f8b-9289-4f0c-9531-c82b3a066b14", 00:24:35.758 "md_size": 32, 00:24:35.758 "md_interleave": true, 00:24:35.758 "dif_type": 0, 00:24:35.758 "assigned_rate_limits": { 00:24:35.758 "rw_ios_per_sec": 0, 00:24:35.758 "rw_mbytes_per_sec": 0, 00:24:35.758 "r_mbytes_per_sec": 0, 00:24:35.758 "w_mbytes_per_sec": 0 00:24:35.758 }, 00:24:35.758 "claimed": true, 00:24:35.758 "claim_type": "exclusive_write", 00:24:35.758 "zoned": false, 00:24:35.758 "supported_io_types": { 00:24:35.758 "read": true, 00:24:35.758 "write": true, 00:24:35.758 "unmap": true, 00:24:35.758 "write_zeroes": true, 00:24:35.758 "flush": true, 00:24:35.758 "reset": true, 00:24:35.758 "compare": false, 00:24:35.758 "compare_and_write": false, 00:24:35.758 "abort": true, 00:24:35.758 "nvme_admin": false, 00:24:35.758 "nvme_io": false 
00:24:35.758 }, 00:24:35.758 "memory_domains": [ 00:24:35.758 { 00:24:35.758 "dma_device_id": "system", 00:24:35.758 "dma_device_type": 1 00:24:35.758 }, 00:24:35.758 { 00:24:35.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:35.758 "dma_device_type": 2 00:24:35.758 } 00:24:35.758 ], 00:24:35.758 "driver_specific": {} 00:24:35.758 } 00:24:35.758 ] 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.758 13:52:50 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.758 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:36.018 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.018 "name": "Existed_Raid", 00:24:36.018 "uuid": "6f7e3c0c-ebf7-41c0-81de-f5649af1c281", 00:24:36.018 "strip_size_kb": 0, 00:24:36.018 "state": "online", 00:24:36.018 "raid_level": "raid1", 00:24:36.018 "superblock": true, 00:24:36.018 "num_base_bdevs": 2, 00:24:36.018 "num_base_bdevs_discovered": 2, 00:24:36.018 "num_base_bdevs_operational": 2, 00:24:36.018 "base_bdevs_list": [ 00:24:36.018 { 00:24:36.018 "name": "BaseBdev1", 00:24:36.018 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:36.018 "is_configured": true, 00:24:36.018 "data_offset": 256, 00:24:36.018 "data_size": 7936 00:24:36.018 }, 00:24:36.018 { 00:24:36.018 "name": "BaseBdev2", 00:24:36.018 "uuid": "19980f8b-9289-4f0c-9531-c82b3a066b14", 00:24:36.018 "is_configured": true, 00:24:36.018 "data_offset": 256, 00:24:36.018 "data_size": 7936 00:24:36.018 } 00:24:36.018 ] 00:24:36.018 }' 00:24:36.018 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.018 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:36.588 
13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:36.588 [2024-06-10 13:52:50.969334] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:36.588 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:36.588 "name": "Existed_Raid", 00:24:36.588 "aliases": [ 00:24:36.588 "6f7e3c0c-ebf7-41c0-81de-f5649af1c281" 00:24:36.588 ], 00:24:36.588 "product_name": "Raid Volume", 00:24:36.588 "block_size": 4128, 00:24:36.588 "num_blocks": 7936, 00:24:36.588 "uuid": "6f7e3c0c-ebf7-41c0-81de-f5649af1c281", 00:24:36.588 "md_size": 32, 00:24:36.588 "md_interleave": true, 00:24:36.588 "dif_type": 0, 00:24:36.588 "assigned_rate_limits": { 00:24:36.588 "rw_ios_per_sec": 0, 00:24:36.588 "rw_mbytes_per_sec": 0, 00:24:36.588 "r_mbytes_per_sec": 0, 00:24:36.588 "w_mbytes_per_sec": 0 00:24:36.588 }, 00:24:36.588 "claimed": false, 00:24:36.588 "zoned": false, 00:24:36.588 "supported_io_types": { 00:24:36.588 "read": true, 00:24:36.588 "write": true, 00:24:36.588 "unmap": false, 00:24:36.588 "write_zeroes": true, 00:24:36.588 "flush": false, 00:24:36.588 "reset": true, 00:24:36.588 "compare": false, 00:24:36.589 "compare_and_write": 
false, 00:24:36.589 "abort": false, 00:24:36.589 "nvme_admin": false, 00:24:36.589 "nvme_io": false 00:24:36.589 }, 00:24:36.589 "memory_domains": [ 00:24:36.589 { 00:24:36.589 "dma_device_id": "system", 00:24:36.589 "dma_device_type": 1 00:24:36.589 }, 00:24:36.589 { 00:24:36.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.589 "dma_device_type": 2 00:24:36.589 }, 00:24:36.589 { 00:24:36.589 "dma_device_id": "system", 00:24:36.589 "dma_device_type": 1 00:24:36.589 }, 00:24:36.589 { 00:24:36.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.589 "dma_device_type": 2 00:24:36.589 } 00:24:36.589 ], 00:24:36.589 "driver_specific": { 00:24:36.589 "raid": { 00:24:36.589 "uuid": "6f7e3c0c-ebf7-41c0-81de-f5649af1c281", 00:24:36.589 "strip_size_kb": 0, 00:24:36.589 "state": "online", 00:24:36.589 "raid_level": "raid1", 00:24:36.589 "superblock": true, 00:24:36.589 "num_base_bdevs": 2, 00:24:36.589 "num_base_bdevs_discovered": 2, 00:24:36.589 "num_base_bdevs_operational": 2, 00:24:36.589 "base_bdevs_list": [ 00:24:36.589 { 00:24:36.589 "name": "BaseBdev1", 00:24:36.589 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:36.589 "is_configured": true, 00:24:36.589 "data_offset": 256, 00:24:36.589 "data_size": 7936 00:24:36.589 }, 00:24:36.589 { 00:24:36.589 "name": "BaseBdev2", 00:24:36.589 "uuid": "19980f8b-9289-4f0c-9531-c82b3a066b14", 00:24:36.589 "is_configured": true, 00:24:36.589 "data_offset": 256, 00:24:36.589 "data_size": 7936 00:24:36.589 } 00:24:36.589 ] 00:24:36.589 } 00:24:36.589 } 00:24:36.589 }' 00:24:36.589 13:52:50 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:36.589 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:24:36.589 BaseBdev2' 00:24:36.589 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:24:36.589 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:24:36.589 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:36.849 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:36.849 "name": "BaseBdev1", 00:24:36.849 "aliases": [ 00:24:36.849 "0011b752-3c7d-4978-b42a-e44368d87e22" 00:24:36.849 ], 00:24:36.849 "product_name": "Malloc disk", 00:24:36.849 "block_size": 4128, 00:24:36.849 "num_blocks": 8192, 00:24:36.849 "uuid": "0011b752-3c7d-4978-b42a-e44368d87e22", 00:24:36.849 "md_size": 32, 00:24:36.849 "md_interleave": true, 00:24:36.849 "dif_type": 0, 00:24:36.849 "assigned_rate_limits": { 00:24:36.849 "rw_ios_per_sec": 0, 00:24:36.849 "rw_mbytes_per_sec": 0, 00:24:36.849 "r_mbytes_per_sec": 0, 00:24:36.849 "w_mbytes_per_sec": 0 00:24:36.849 }, 00:24:36.849 "claimed": true, 00:24:36.849 "claim_type": "exclusive_write", 00:24:36.849 "zoned": false, 00:24:36.849 "supported_io_types": { 00:24:36.849 "read": true, 00:24:36.849 "write": true, 00:24:36.849 "unmap": true, 00:24:36.849 "write_zeroes": true, 00:24:36.849 "flush": true, 00:24:36.849 "reset": true, 00:24:36.849 "compare": false, 00:24:36.849 "compare_and_write": false, 00:24:36.849 "abort": true, 00:24:36.849 "nvme_admin": false, 00:24:36.849 "nvme_io": false 00:24:36.849 }, 00:24:36.849 "memory_domains": [ 00:24:36.849 { 00:24:36.849 "dma_device_id": "system", 00:24:36.849 "dma_device_type": 1 00:24:36.849 }, 00:24:36.849 { 00:24:36.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:36.849 "dma_device_type": 2 00:24:36.849 } 00:24:36.849 ], 00:24:36.849 "driver_specific": {} 00:24:36.849 }' 00:24:36.849 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:24:36.849 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:36.849 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:36.849 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:37.109 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:37.368 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:37.368 "name": "BaseBdev2", 00:24:37.368 "aliases": [ 
00:24:37.368 "19980f8b-9289-4f0c-9531-c82b3a066b14" 00:24:37.368 ], 00:24:37.368 "product_name": "Malloc disk", 00:24:37.368 "block_size": 4128, 00:24:37.369 "num_blocks": 8192, 00:24:37.369 "uuid": "19980f8b-9289-4f0c-9531-c82b3a066b14", 00:24:37.369 "md_size": 32, 00:24:37.369 "md_interleave": true, 00:24:37.369 "dif_type": 0, 00:24:37.369 "assigned_rate_limits": { 00:24:37.369 "rw_ios_per_sec": 0, 00:24:37.369 "rw_mbytes_per_sec": 0, 00:24:37.369 "r_mbytes_per_sec": 0, 00:24:37.369 "w_mbytes_per_sec": 0 00:24:37.369 }, 00:24:37.369 "claimed": true, 00:24:37.369 "claim_type": "exclusive_write", 00:24:37.369 "zoned": false, 00:24:37.369 "supported_io_types": { 00:24:37.369 "read": true, 00:24:37.369 "write": true, 00:24:37.369 "unmap": true, 00:24:37.369 "write_zeroes": true, 00:24:37.369 "flush": true, 00:24:37.369 "reset": true, 00:24:37.369 "compare": false, 00:24:37.369 "compare_and_write": false, 00:24:37.369 "abort": true, 00:24:37.369 "nvme_admin": false, 00:24:37.369 "nvme_io": false 00:24:37.369 }, 00:24:37.369 "memory_domains": [ 00:24:37.369 { 00:24:37.369 "dma_device_id": "system", 00:24:37.369 "dma_device_type": 1 00:24:37.369 }, 00:24:37.369 { 00:24:37.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:37.369 "dma_device_type": 2 00:24:37.369 } 00:24:37.369 ], 00:24:37.369 "driver_specific": {} 00:24:37.369 }' 00:24:37.369 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.369 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:37.628 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:37.628 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.628 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:37.628 13:52:51 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:37.628 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.628 13:52:51 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:37.628 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:37.629 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.629 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:24:37.888 [2024-06-10 13:52:52.308558] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.888 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:38.147 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:38.147 "name": "Existed_Raid", 00:24:38.147 "uuid": "6f7e3c0c-ebf7-41c0-81de-f5649af1c281", 00:24:38.147 "strip_size_kb": 0, 00:24:38.147 "state": "online", 00:24:38.147 "raid_level": "raid1", 00:24:38.147 "superblock": true, 00:24:38.147 "num_base_bdevs": 2, 00:24:38.147 "num_base_bdevs_discovered": 1, 00:24:38.147 "num_base_bdevs_operational": 1, 00:24:38.147 "base_bdevs_list": [ 00:24:38.147 { 
00:24:38.147 "name": null, 00:24:38.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.147 "is_configured": false, 00:24:38.147 "data_offset": 256, 00:24:38.147 "data_size": 7936 00:24:38.147 }, 00:24:38.147 { 00:24:38.147 "name": "BaseBdev2", 00:24:38.147 "uuid": "19980f8b-9289-4f0c-9531-c82b3a066b14", 00:24:38.147 "is_configured": true, 00:24:38.147 "data_offset": 256, 00:24:38.147 "data_size": 7936 00:24:38.147 } 00:24:38.147 ] 00:24:38.147 }' 00:24:38.147 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:38.147 13:52:52 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:38.717 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:24:38.717 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:38.717 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.717 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:24:38.976 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:24:38.976 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:24:38.976 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:24:39.236 [2024-06-10 13:52:53.487550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:39.236 [2024-06-10 13:52:53.487617] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:24:39.236 [2024-06-10 13:52:53.494051] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:39.236 [2024-06-10 13:52:53.494076] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:39.236 [2024-06-10 13:52:53.494082] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20eee90 name Existed_Raid, state offline 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:24:39.236 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1685115 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1685115 ']' 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1685115 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1685115 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1685115' 00:24:39.497 killing process with pid 1685115 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 1685115 00:24:39.497 [2024-06-10 13:52:53.765818] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 1685115 00:24:39.497 [2024-06-10 13:52:53.766434] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:24:39.497 00:24:39.497 real 0m9.057s 00:24:39.497 user 0m16.470s 00:24:39.497 sys 0m1.369s 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:39.497 13:52:53 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:39.497 ************************************ 00:24:39.497 END TEST raid_state_function_test_sb_md_interleaved 00:24:39.497 ************************************ 00:24:39.497 13:52:53 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:24:39.497 13:52:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:24:39.497 
13:52:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:39.497 13:52:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:39.497 ************************************ 00:24:39.497 START TEST raid_superblock_test_md_interleaved 00:24:39.497 ************************************ 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local 
raid_bdev 00:24:39.497 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=1687001 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1687001 /var/tmp/spdk-raid.sock 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1687001 ']' 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:39.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:39.757 13:52:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:39.757 [2024-06-10 13:52:54.022971] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:24:39.757 [2024-06-10 13:52:54.023018] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1687001 ] 00:24:39.757 [2024-06-10 13:52:54.112923] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:39.757 [2024-06-10 13:52:54.187360] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:24:39.757 [2024-06-10 13:52:54.230445] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:39.757 [2024-06-10 13:52:54.230470] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:40.697 13:52:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:40.697 13:52:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:24:40.697 malloc1 00:24:40.697 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:40.957 [2024-06-10 13:52:55.265845] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:40.957 [2024-06-10 13:52:55.265879] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:40.957 [2024-06-10 13:52:55.265897] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10cbe70 00:24:40.957 [2024-06-10 13:52:55.265904] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:40.957 [2024-06-10 13:52:55.267120] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:40.957 [2024-06-10 13:52:55.267139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:40.957 pt1 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:40.957 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:24:41.218 malloc2 00:24:41.218 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:41.218 [2024-06-10 13:52:55.656881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:41.218 [2024-06-10 13:52:55.656906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:41.218 [2024-06-10 13:52:55.656915] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1259610 00:24:41.218 [2024-06-10 13:52:55.656922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:41.218 [2024-06-10 13:52:55.658064] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:41.218 [2024-06-10 13:52:55.658082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:41.218 pt2 00:24:41.218 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:41.218 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:41.218 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:24:41.479 [2024-06-10 13:52:55.849374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:41.479 [2024-06-10 13:52:55.850482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:41.479 [2024-06-10 13:52:55.850601] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x125b0d0 00:24:41.479 [2024-06-10 13:52:55.850610] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:41.479 [2024-06-10 13:52:55.850657] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10ca030 00:24:41.479 [2024-06-10 13:52:55.850723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x125b0d0 00:24:41.479 [2024-06-10 13:52:55.850729] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x125b0d0 00:24:41.479 [2024-06-10 13:52:55.850772] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.479 13:52:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.739 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.739 "name": "raid_bdev1", 00:24:41.739 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:41.739 "strip_size_kb": 0, 00:24:41.739 "state": "online", 00:24:41.739 "raid_level": "raid1", 00:24:41.739 "superblock": true, 00:24:41.739 "num_base_bdevs": 2, 00:24:41.739 "num_base_bdevs_discovered": 2, 00:24:41.739 "num_base_bdevs_operational": 2, 00:24:41.739 "base_bdevs_list": [ 00:24:41.739 { 00:24:41.739 "name": "pt1", 00:24:41.739 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:41.739 "is_configured": true, 00:24:41.739 "data_offset": 256, 00:24:41.739 "data_size": 7936 00:24:41.739 }, 00:24:41.739 { 00:24:41.739 "name": "pt2", 00:24:41.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:41.739 "is_configured": true, 00:24:41.739 "data_offset": 256, 00:24:41.739 "data_size": 7936 00:24:41.739 } 00:24:41.739 ] 00:24:41.739 }' 00:24:41.739 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.739 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:24:42.309 13:52:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:42.309 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:42.309 [2024-06-10 13:52:56.771897] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:42.569 "name": "raid_bdev1", 00:24:42.569 "aliases": [ 00:24:42.569 "d6ccee2c-1e1f-41ca-94d6-5e0303995e62" 00:24:42.569 ], 00:24:42.569 "product_name": "Raid Volume", 00:24:42.569 "block_size": 4128, 00:24:42.569 "num_blocks": 7936, 00:24:42.569 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:42.569 "md_size": 32, 00:24:42.569 "md_interleave": true, 00:24:42.569 "dif_type": 0, 00:24:42.569 "assigned_rate_limits": { 00:24:42.569 "rw_ios_per_sec": 0, 00:24:42.569 "rw_mbytes_per_sec": 0, 00:24:42.569 "r_mbytes_per_sec": 0, 00:24:42.569 "w_mbytes_per_sec": 0 00:24:42.569 }, 00:24:42.569 "claimed": false, 00:24:42.569 "zoned": false, 00:24:42.569 "supported_io_types": { 00:24:42.569 "read": true, 00:24:42.569 "write": true, 00:24:42.569 "unmap": false, 00:24:42.569 "write_zeroes": true, 00:24:42.569 "flush": false, 00:24:42.569 "reset": true, 
00:24:42.569 "compare": false, 00:24:42.569 "compare_and_write": false, 00:24:42.569 "abort": false, 00:24:42.569 "nvme_admin": false, 00:24:42.569 "nvme_io": false 00:24:42.569 }, 00:24:42.569 "memory_domains": [ 00:24:42.569 { 00:24:42.569 "dma_device_id": "system", 00:24:42.569 "dma_device_type": 1 00:24:42.569 }, 00:24:42.569 { 00:24:42.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.569 "dma_device_type": 2 00:24:42.569 }, 00:24:42.569 { 00:24:42.569 "dma_device_id": "system", 00:24:42.569 "dma_device_type": 1 00:24:42.569 }, 00:24:42.569 { 00:24:42.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.569 "dma_device_type": 2 00:24:42.569 } 00:24:42.569 ], 00:24:42.569 "driver_specific": { 00:24:42.569 "raid": { 00:24:42.569 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:42.569 "strip_size_kb": 0, 00:24:42.569 "state": "online", 00:24:42.569 "raid_level": "raid1", 00:24:42.569 "superblock": true, 00:24:42.569 "num_base_bdevs": 2, 00:24:42.569 "num_base_bdevs_discovered": 2, 00:24:42.569 "num_base_bdevs_operational": 2, 00:24:42.569 "base_bdevs_list": [ 00:24:42.569 { 00:24:42.569 "name": "pt1", 00:24:42.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:42.569 "is_configured": true, 00:24:42.569 "data_offset": 256, 00:24:42.569 "data_size": 7936 00:24:42.569 }, 00:24:42.569 { 00:24:42.569 "name": "pt2", 00:24:42.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:42.569 "is_configured": true, 00:24:42.569 "data_offset": 256, 00:24:42.569 "data_size": 7936 00:24:42.569 } 00:24:42.569 ] 00:24:42.569 } 00:24:42.569 } 00:24:42.569 }' 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:42.569 pt2' 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:42.569 13:52:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:42.569 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:42.569 "name": "pt1", 00:24:42.569 "aliases": [ 00:24:42.569 "00000000-0000-0000-0000-000000000001" 00:24:42.569 ], 00:24:42.569 "product_name": "passthru", 00:24:42.569 "block_size": 4128, 00:24:42.569 "num_blocks": 8192, 00:24:42.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:42.569 "md_size": 32, 00:24:42.569 "md_interleave": true, 00:24:42.569 "dif_type": 0, 00:24:42.569 "assigned_rate_limits": { 00:24:42.569 "rw_ios_per_sec": 0, 00:24:42.569 "rw_mbytes_per_sec": 0, 00:24:42.569 "r_mbytes_per_sec": 0, 00:24:42.569 "w_mbytes_per_sec": 0 00:24:42.569 }, 00:24:42.569 "claimed": true, 00:24:42.569 "claim_type": "exclusive_write", 00:24:42.569 "zoned": false, 00:24:42.569 "supported_io_types": { 00:24:42.569 "read": true, 00:24:42.569 "write": true, 00:24:42.569 "unmap": true, 00:24:42.569 "write_zeroes": true, 00:24:42.569 "flush": true, 00:24:42.569 "reset": true, 00:24:42.569 "compare": false, 00:24:42.569 "compare_and_write": false, 00:24:42.569 "abort": true, 00:24:42.569 "nvme_admin": false, 00:24:42.569 "nvme_io": false 00:24:42.569 }, 00:24:42.569 "memory_domains": [ 00:24:42.569 { 00:24:42.569 "dma_device_id": "system", 00:24:42.569 "dma_device_type": 1 00:24:42.569 }, 00:24:42.569 { 00:24:42.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:42.569 "dma_device_type": 2 00:24:42.569 } 00:24:42.569 ], 00:24:42.569 "driver_specific": { 00:24:42.569 "passthru": { 00:24:42.569 "name": "pt1", 00:24:42.569 "base_bdev_name": "malloc1" 00:24:42.569 } 00:24:42.569 } 
00:24:42.569 }' 00:24:42.569 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:42.830 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:43.090 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:43.351 "name": "pt2", 00:24:43.351 "aliases": [ 
00:24:43.351 "00000000-0000-0000-0000-000000000002" 00:24:43.351 ], 00:24:43.351 "product_name": "passthru", 00:24:43.351 "block_size": 4128, 00:24:43.351 "num_blocks": 8192, 00:24:43.351 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:43.351 "md_size": 32, 00:24:43.351 "md_interleave": true, 00:24:43.351 "dif_type": 0, 00:24:43.351 "assigned_rate_limits": { 00:24:43.351 "rw_ios_per_sec": 0, 00:24:43.351 "rw_mbytes_per_sec": 0, 00:24:43.351 "r_mbytes_per_sec": 0, 00:24:43.351 "w_mbytes_per_sec": 0 00:24:43.351 }, 00:24:43.351 "claimed": true, 00:24:43.351 "claim_type": "exclusive_write", 00:24:43.351 "zoned": false, 00:24:43.351 "supported_io_types": { 00:24:43.351 "read": true, 00:24:43.351 "write": true, 00:24:43.351 "unmap": true, 00:24:43.351 "write_zeroes": true, 00:24:43.351 "flush": true, 00:24:43.351 "reset": true, 00:24:43.351 "compare": false, 00:24:43.351 "compare_and_write": false, 00:24:43.351 "abort": true, 00:24:43.351 "nvme_admin": false, 00:24:43.351 "nvme_io": false 00:24:43.351 }, 00:24:43.351 "memory_domains": [ 00:24:43.351 { 00:24:43.351 "dma_device_id": "system", 00:24:43.351 "dma_device_type": 1 00:24:43.351 }, 00:24:43.351 { 00:24:43.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:43.351 "dma_device_type": 2 00:24:43.351 } 00:24:43.351 ], 00:24:43.351 "driver_specific": { 00:24:43.351 "passthru": { 00:24:43.351 "name": "pt2", 00:24:43.351 "base_bdev_name": "malloc2" 00:24:43.351 } 00:24:43.351 } 00:24:43.351 }' 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:43.351 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:43.612 13:52:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:24:43.872 [2024-06-10 13:52:58.123313] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:43.872 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d6ccee2c-1e1f-41ca-94d6-5e0303995e62 00:24:43.872 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z d6ccee2c-1e1f-41ca-94d6-5e0303995e62 ']' 00:24:43.872 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:43.872 [2024-06-10 13:52:58.315607] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:43.872 [2024-06-10 13:52:58.315618] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:24:43.872 [2024-06-10 13:52:58.315659] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:43.872 [2024-06-10 13:52:58.315705] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:43.872 [2024-06-10 13:52:58.315714] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125b0d0 name raid_bdev1, state offline 00:24:43.872 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.872 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:24:44.133 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:24:44.133 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:24:44.133 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:44.133 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:44.393 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:44.393 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:44.654 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:44.654 13:52:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:44.654 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:24:44.915 [2024-06-10 13:52:59.302064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:44.915 [2024-06-10 13:52:59.303224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:44.915 [2024-06-10 13:52:59.303266] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:44.915 [2024-06-10 13:52:59.303296] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:44.915 [2024-06-10 13:52:59.303308] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:44.915 [2024-06-10 13:52:59.303314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10cb7b0 name raid_bdev1, state configuring 00:24:44.915 request: 00:24:44.915 { 00:24:44.915 "name": "raid_bdev1", 00:24:44.915 "raid_level": "raid1", 00:24:44.915 "base_bdevs": [ 00:24:44.915 "malloc1", 00:24:44.915 "malloc2" 00:24:44.915 ], 00:24:44.915 "superblock": false, 00:24:44.915 "method": "bdev_raid_create", 00:24:44.915 "req_id": 1 00:24:44.915 } 00:24:44.915 Got JSON-RPC error response 00:24:44.915 response: 00:24:44.915 { 00:24:44.915 "code": -17, 00:24:44.915 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:44.915 } 00:24:44.915 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:24:44.915 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:44.915 13:52:59 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:44.915 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:44.915 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.915 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:24:45.176 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:24:45.176 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:24:45.176 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:45.437 [2024-06-10 13:52:59.699026] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:45.437 [2024-06-10 13:52:59.699047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.437 [2024-06-10 13:52:59.699059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10ca220 00:24:45.437 [2024-06-10 13:52:59.699066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.437 [2024-06-10 13:52:59.700246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.437 [2024-06-10 13:52:59.700264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:45.437 [2024-06-10 13:52:59.700293] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:45.437 [2024-06-10 13:52:59.700310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:24:45.437 pt1 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.437 "name": "raid_bdev1", 00:24:45.437 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:45.437 "strip_size_kb": 0, 00:24:45.437 "state": "configuring", 00:24:45.437 "raid_level": "raid1", 00:24:45.437 "superblock": true, 00:24:45.437 
"num_base_bdevs": 2, 00:24:45.437 "num_base_bdevs_discovered": 1, 00:24:45.437 "num_base_bdevs_operational": 2, 00:24:45.437 "base_bdevs_list": [ 00:24:45.437 { 00:24:45.437 "name": "pt1", 00:24:45.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:45.437 "is_configured": true, 00:24:45.437 "data_offset": 256, 00:24:45.437 "data_size": 7936 00:24:45.437 }, 00:24:45.437 { 00:24:45.437 "name": null, 00:24:45.437 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:45.437 "is_configured": false, 00:24:45.437 "data_offset": 256, 00:24:45.437 "data_size": 7936 00:24:45.437 } 00:24:45.437 ] 00:24:45.437 }' 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.437 13:52:59 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:46.010 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:24:46.010 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:24:46.010 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:46.010 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:46.271 [2024-06-10 13:53:00.505082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:46.271 [2024-06-10 13:53:00.505117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.271 [2024-06-10 13:53:00.505128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125c930 00:24:46.271 [2024-06-10 13:53:00.505135] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.271 [2024-06-10 13:53:00.505271] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:46.271 [2024-06-10 13:53:00.505281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:46.271 [2024-06-10 13:53:00.505312] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:46.271 [2024-06-10 13:53:00.505324] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:46.271 [2024-06-10 13:53:00.505393] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10cabd0 00:24:46.271 [2024-06-10 13:53:00.505399] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:46.271 [2024-06-10 13:53:00.505441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1259da0 00:24:46.271 [2024-06-10 13:53:00.505501] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10cabd0 00:24:46.271 [2024-06-10 13:53:00.505506] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10cabd0 00:24:46.271 [2024-06-10 13:53:00.505553] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:46.271 pt2 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.271 13:53:00 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.271 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.272 "name": "raid_bdev1", 00:24:46.272 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:46.272 "strip_size_kb": 0, 00:24:46.272 "state": "online", 00:24:46.272 "raid_level": "raid1", 00:24:46.272 "superblock": true, 00:24:46.272 "num_base_bdevs": 2, 00:24:46.272 "num_base_bdevs_discovered": 2, 00:24:46.272 "num_base_bdevs_operational": 2, 00:24:46.272 "base_bdevs_list": [ 00:24:46.272 { 00:24:46.272 "name": "pt1", 00:24:46.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:46.272 "is_configured": true, 00:24:46.272 "data_offset": 256, 00:24:46.272 "data_size": 7936 00:24:46.272 }, 00:24:46.272 { 00:24:46.272 "name": "pt2", 00:24:46.272 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:46.272 "is_configured": true, 00:24:46.272 "data_offset": 256, 00:24:46.272 "data_size": 7936 00:24:46.272 
} 00:24:46.272 ] 00:24:46.272 }' 00:24:46.272 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.272 13:53:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:46.842 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:47.104 [2024-06-10 13:53:01.427607] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:47.104 "name": "raid_bdev1", 00:24:47.104 "aliases": [ 00:24:47.104 "d6ccee2c-1e1f-41ca-94d6-5e0303995e62" 00:24:47.104 ], 00:24:47.104 "product_name": "Raid Volume", 00:24:47.104 "block_size": 4128, 00:24:47.104 "num_blocks": 7936, 00:24:47.104 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:47.104 "md_size": 32, 00:24:47.104 "md_interleave": true, 00:24:47.104 "dif_type": 0, 00:24:47.104 "assigned_rate_limits": { 00:24:47.104 
"rw_ios_per_sec": 0, 00:24:47.104 "rw_mbytes_per_sec": 0, 00:24:47.104 "r_mbytes_per_sec": 0, 00:24:47.104 "w_mbytes_per_sec": 0 00:24:47.104 }, 00:24:47.104 "claimed": false, 00:24:47.104 "zoned": false, 00:24:47.104 "supported_io_types": { 00:24:47.104 "read": true, 00:24:47.104 "write": true, 00:24:47.104 "unmap": false, 00:24:47.104 "write_zeroes": true, 00:24:47.104 "flush": false, 00:24:47.104 "reset": true, 00:24:47.104 "compare": false, 00:24:47.104 "compare_and_write": false, 00:24:47.104 "abort": false, 00:24:47.104 "nvme_admin": false, 00:24:47.104 "nvme_io": false 00:24:47.104 }, 00:24:47.104 "memory_domains": [ 00:24:47.104 { 00:24:47.104 "dma_device_id": "system", 00:24:47.104 "dma_device_type": 1 00:24:47.104 }, 00:24:47.104 { 00:24:47.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:47.104 "dma_device_type": 2 00:24:47.104 }, 00:24:47.104 { 00:24:47.104 "dma_device_id": "system", 00:24:47.104 "dma_device_type": 1 00:24:47.104 }, 00:24:47.104 { 00:24:47.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:47.104 "dma_device_type": 2 00:24:47.104 } 00:24:47.104 ], 00:24:47.104 "driver_specific": { 00:24:47.104 "raid": { 00:24:47.104 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:47.104 "strip_size_kb": 0, 00:24:47.104 "state": "online", 00:24:47.104 "raid_level": "raid1", 00:24:47.104 "superblock": true, 00:24:47.104 "num_base_bdevs": 2, 00:24:47.104 "num_base_bdevs_discovered": 2, 00:24:47.104 "num_base_bdevs_operational": 2, 00:24:47.104 "base_bdevs_list": [ 00:24:47.104 { 00:24:47.104 "name": "pt1", 00:24:47.104 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:47.104 "is_configured": true, 00:24:47.104 "data_offset": 256, 00:24:47.104 "data_size": 7936 00:24:47.104 }, 00:24:47.104 { 00:24:47.104 "name": "pt2", 00:24:47.104 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:47.104 "is_configured": true, 00:24:47.104 "data_offset": 256, 00:24:47.104 "data_size": 7936 00:24:47.104 } 00:24:47.104 ] 00:24:47.104 } 00:24:47.104 } 
00:24:47.104 }' 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:47.104 pt2' 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:47.104 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:47.365 "name": "pt1", 00:24:47.365 "aliases": [ 00:24:47.365 "00000000-0000-0000-0000-000000000001" 00:24:47.365 ], 00:24:47.365 "product_name": "passthru", 00:24:47.365 "block_size": 4128, 00:24:47.365 "num_blocks": 8192, 00:24:47.365 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:47.365 "md_size": 32, 00:24:47.365 "md_interleave": true, 00:24:47.365 "dif_type": 0, 00:24:47.365 "assigned_rate_limits": { 00:24:47.365 "rw_ios_per_sec": 0, 00:24:47.365 "rw_mbytes_per_sec": 0, 00:24:47.365 "r_mbytes_per_sec": 0, 00:24:47.365 "w_mbytes_per_sec": 0 00:24:47.365 }, 00:24:47.365 "claimed": true, 00:24:47.365 "claim_type": "exclusive_write", 00:24:47.365 "zoned": false, 00:24:47.365 "supported_io_types": { 00:24:47.365 "read": true, 00:24:47.365 "write": true, 00:24:47.365 "unmap": true, 00:24:47.365 "write_zeroes": true, 00:24:47.365 "flush": true, 00:24:47.365 "reset": true, 00:24:47.365 "compare": false, 00:24:47.365 "compare_and_write": false, 00:24:47.365 "abort": true, 00:24:47.365 "nvme_admin": false, 00:24:47.365 "nvme_io": false 00:24:47.365 }, 00:24:47.365 
"memory_domains": [ 00:24:47.365 { 00:24:47.365 "dma_device_id": "system", 00:24:47.365 "dma_device_type": 1 00:24:47.365 }, 00:24:47.365 { 00:24:47.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:47.365 "dma_device_type": 2 00:24:47.365 } 00:24:47.365 ], 00:24:47.365 "driver_specific": { 00:24:47.365 "passthru": { 00:24:47.365 "name": "pt1", 00:24:47.365 "base_bdev_name": "malloc1" 00:24:47.365 } 00:24:47.365 } 00:24:47.365 }' 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:47.365 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:47.626 13:53:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:47.886 "name": "pt2", 00:24:47.886 "aliases": [ 00:24:47.886 "00000000-0000-0000-0000-000000000002" 00:24:47.886 ], 00:24:47.886 "product_name": "passthru", 00:24:47.886 "block_size": 4128, 00:24:47.886 "num_blocks": 8192, 00:24:47.886 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:47.886 "md_size": 32, 00:24:47.886 "md_interleave": true, 00:24:47.886 "dif_type": 0, 00:24:47.886 "assigned_rate_limits": { 00:24:47.886 "rw_ios_per_sec": 0, 00:24:47.886 "rw_mbytes_per_sec": 0, 00:24:47.886 "r_mbytes_per_sec": 0, 00:24:47.886 "w_mbytes_per_sec": 0 00:24:47.886 }, 00:24:47.886 "claimed": true, 00:24:47.886 "claim_type": "exclusive_write", 00:24:47.886 "zoned": false, 00:24:47.886 "supported_io_types": { 00:24:47.886 "read": true, 00:24:47.886 "write": true, 00:24:47.886 "unmap": true, 00:24:47.886 "write_zeroes": true, 00:24:47.886 "flush": true, 00:24:47.886 "reset": true, 00:24:47.886 "compare": false, 00:24:47.886 "compare_and_write": false, 00:24:47.886 "abort": true, 00:24:47.886 "nvme_admin": false, 00:24:47.886 "nvme_io": false 00:24:47.886 }, 00:24:47.886 "memory_domains": [ 00:24:47.886 { 00:24:47.886 "dma_device_id": "system", 00:24:47.886 "dma_device_type": 1 00:24:47.886 }, 00:24:47.886 { 00:24:47.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:47.886 "dma_device_type": 2 00:24:47.886 } 00:24:47.886 ], 00:24:47.886 "driver_specific": { 00:24:47.886 "passthru": { 00:24:47.886 "name": "pt2", 00:24:47.886 "base_bdev_name": "malloc2" 00:24:47.886 } 00:24:47.886 } 00:24:47.886 }' 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:47.886 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:48.147 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:24:48.408 [2024-06-10 13:53:02.634648] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' d6ccee2c-1e1f-41ca-94d6-5e0303995e62 '!=' d6ccee2c-1e1f-41ca-94d6-5e0303995e62 ']' 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy 
raid1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:48.408 [2024-06-10 13:53:02.826971] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:48.408 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.669 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.669 "name": "raid_bdev1", 00:24:48.669 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:48.669 "strip_size_kb": 0, 00:24:48.669 "state": "online", 00:24:48.669 "raid_level": "raid1", 00:24:48.669 "superblock": true, 00:24:48.669 "num_base_bdevs": 2, 00:24:48.669 "num_base_bdevs_discovered": 1, 00:24:48.669 "num_base_bdevs_operational": 1, 00:24:48.669 "base_bdevs_list": [ 00:24:48.669 { 00:24:48.669 "name": null, 00:24:48.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.669 "is_configured": false, 00:24:48.669 "data_offset": 256, 00:24:48.669 "data_size": 7936 00:24:48.669 }, 00:24:48.669 { 00:24:48.669 "name": "pt2", 00:24:48.669 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:48.669 "is_configured": true, 00:24:48.669 "data_offset": 256, 00:24:48.669 "data_size": 7936 00:24:48.669 } 00:24:48.669 ] 00:24:48.669 }' 00:24:48.669 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.669 13:53:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:49.239 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:49.239 [2024-06-10 13:53:03.685142] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:49.239 [2024-06-10 13:53:03.685158] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:49.239 [2024-06-10 13:53:03.685197] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:49.239 [2024-06-10 13:53:03.685228] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:49.239 [2024-06-10 13:53:03.685235] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10cabd0 name raid_bdev1, state offline 00:24:49.239 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.239 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:49.498 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:49.498 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:49.498 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:49.499 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:49.499 13:53:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:24:49.758 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:50.018 [2024-06-10 13:53:04.294661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:50.018 [2024-06-10 13:53:04.294689] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:50.018 [2024-06-10 13:53:04.294701] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1259e90 00:24:50.018 [2024-06-10 13:53:04.294707] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:50.018 [2024-06-10 13:53:04.295894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:50.018 [2024-06-10 13:53:04.295912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:50.018 [2024-06-10 13:53:04.295944] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:50.018 [2024-06-10 13:53:04.295960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:50.018 [2024-06-10 13:53:04.296010] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x125d030 00:24:50.018 [2024-06-10 13:53:04.296016] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:50.018 [2024-06-10 13:53:04.296057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c1770 00:24:50.018 [2024-06-10 13:53:04.296115] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x125d030 00:24:50.018 [2024-06-10 13:53:04.296120] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x125d030 00:24:50.018 [2024-06-10 13:53:04.296169] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:50.018 pt2 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:50.018 13:53:04 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.018 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.277 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.277 "name": "raid_bdev1", 00:24:50.277 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:50.277 "strip_size_kb": 0, 00:24:50.277 "state": "online", 00:24:50.277 "raid_level": "raid1", 00:24:50.277 "superblock": true, 00:24:50.277 "num_base_bdevs": 2, 00:24:50.277 "num_base_bdevs_discovered": 1, 00:24:50.277 "num_base_bdevs_operational": 1, 00:24:50.277 "base_bdevs_list": [ 00:24:50.277 { 00:24:50.277 "name": null, 00:24:50.277 
"uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.277 "is_configured": false, 00:24:50.277 "data_offset": 256, 00:24:50.277 "data_size": 7936 00:24:50.277 }, 00:24:50.277 { 00:24:50.277 "name": "pt2", 00:24:50.277 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:50.277 "is_configured": true, 00:24:50.277 "data_offset": 256, 00:24:50.277 "data_size": 7936 00:24:50.277 } 00:24:50.277 ] 00:24:50.277 }' 00:24:50.277 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.277 13:53:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:50.846 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:50.846 [2024-06-10 13:53:05.261114] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:50.846 [2024-06-10 13:53:05.261127] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:50.846 [2024-06-10 13:53:05.261157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:50.846 [2024-06-10 13:53:05.261191] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:50.846 [2024-06-10 13:53:05.261197] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125d030 name raid_bdev1, state offline 00:24:50.846 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.846 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:51.105 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:51.105 13:53:05 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:51.105 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:24:51.105 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:51.365 [2024-06-10 13:53:05.666126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:51.365 [2024-06-10 13:53:05.666150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.365 [2024-06-10 13:53:05.666159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125b960 00:24:51.365 [2024-06-10 13:53:05.666170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.365 [2024-06-10 13:53:05.667359] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.365 [2024-06-10 13:53:05.667376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:51.365 [2024-06-10 13:53:05.667407] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:51.365 [2024-06-10 13:53:05.667423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:51.365 [2024-06-10 13:53:05.667485] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:51.365 [2024-06-10 13:53:05.667492] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:51.365 [2024-06-10 13:53:05.667500] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125d910 name raid_bdev1, state configuring 00:24:51.365 [2024-06-10 13:53:05.667514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:24:51.365 [2024-06-10 13:53:05.667552] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x125d2b0 00:24:51.365 [2024-06-10 13:53:05.667558] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:51.365 [2024-06-10 13:53:05.667602] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1259da0 00:24:51.365 [2024-06-10 13:53:05.667660] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x125d2b0 00:24:51.365 [2024-06-10 13:53:05.667665] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x125d2b0 00:24:51.365 [2024-06-10 13:53:05.667714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.365 pt1 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.365 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.625 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.625 "name": "raid_bdev1", 00:24:51.625 "uuid": "d6ccee2c-1e1f-41ca-94d6-5e0303995e62", 00:24:51.625 "strip_size_kb": 0, 00:24:51.625 "state": "online", 00:24:51.625 "raid_level": "raid1", 00:24:51.625 "superblock": true, 00:24:51.625 "num_base_bdevs": 2, 00:24:51.625 "num_base_bdevs_discovered": 1, 00:24:51.625 "num_base_bdevs_operational": 1, 00:24:51.625 "base_bdevs_list": [ 00:24:51.625 { 00:24:51.625 "name": null, 00:24:51.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.625 "is_configured": false, 00:24:51.625 "data_offset": 256, 00:24:51.625 "data_size": 7936 00:24:51.625 }, 00:24:51.625 { 00:24:51.625 "name": "pt2", 00:24:51.625 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:51.625 "is_configured": true, 00:24:51.625 "data_offset": 256, 00:24:51.625 "data_size": 7936 00:24:51.625 } 00:24:51.625 ] 00:24:51.625 }' 00:24:51.625 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.625 13:53:05 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:52.195 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:52.195 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:52.195 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:52.196 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:52.196 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:52.456 [2024-06-10 13:53:06.817210] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' d6ccee2c-1e1f-41ca-94d6-5e0303995e62 '!=' d6ccee2c-1e1f-41ca-94d6-5e0303995e62 ']' 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1687001 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1687001 ']' 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1687001 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1687001 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1687001' 00:24:52.456 
killing process with pid 1687001 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # kill 1687001 00:24:52.456 [2024-06-10 13:53:06.885426] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:52.456 [2024-06-10 13:53:06.885465] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:52.456 [2024-06-10 13:53:06.885496] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:52.456 [2024-06-10 13:53:06.885501] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125d2b0 name raid_bdev1, state offline 00:24:52.456 13:53:06 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@973 -- # wait 1687001 00:24:52.456 [2024-06-10 13:53:06.895346] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:52.716 13:53:07 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:24:52.716 00:24:52.716 real 0m13.049s 00:24:52.716 user 0m24.220s 00:24:52.716 sys 0m1.911s 00:24:52.716 13:53:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:52.716 13:53:07 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:52.716 ************************************ 00:24:52.716 END TEST raid_superblock_test_md_interleaved 00:24:52.716 ************************************ 00:24:52.716 13:53:07 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:24:52.716 13:53:07 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:52.716 13:53:07 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:52.716 13:53:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:52.716 ************************************ 00:24:52.716 START TEST raid_rebuild_test_sb_md_interleaved 
00:24:52.716 ************************************
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false false
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']'
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']'
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s'
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1689947
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1689947 /var/tmp/spdk-raid.sock
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 1689947 ']'
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:24:52.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable
00:24:52.716 13:53:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:24:52.716 [2024-06-10 13:53:07.169432] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:24:52.716 [2024-06-10 13:53:07.169481] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1689947 ]
00:24:52.716 I/O size of 3145728 is greater than zero copy threshold (65536).
00:24:52.716 Zero copy mechanism will not be used.
00:24:52.976 [2024-06-10 13:53:07.256017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:52.976 [2024-06-10 13:53:07.320702] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:24:52.976 [2024-06-10 13:53:07.363393] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:52.976 [2024-06-10 13:53:07.363417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:53.915 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:24:53.915 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0
00:24:53.915 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:24:53.916 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc
00:24:53.916 BaseBdev1_malloc
00:24:53.916 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:24:54.176 [2024-06-10 13:53:08.418541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:24:54.176 [2024-06-10 13:53:08.418575] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:54.176 [2024-06-10 13:53:08.418592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa5f8e0
00:24:54.176 [2024-06-10 13:53:08.418599] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:54.176 [2024-06-10 13:53:08.419841] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:54.176 [2024-06-10 13:53:08.419864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:24:54.176 BaseBdev1
00:24:54.176 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:24:54.176 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc
00:24:54.176 BaseBdev2_malloc
00:24:54.176 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:24:54.436 [2024-06-10 13:53:08.809849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:24:54.436 [2024-06-10 13:53:08.809876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:54.436 [2024-06-10 13:53:08.809888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa446e0
00:24:54.436 [2024-06-10 13:53:08.809895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:54.436 [2024-06-10 13:53:08.811055] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:54.436 [2024-06-10 13:53:08.811072] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:24:54.436 BaseBdev2
00:24:54.436 13:53:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc
00:24:54.695 spare_malloc
00:24:54.695 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:24:54.953 spare_delay
00:24:54.953 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:24:54.953 [2024-06-10 13:53:09.413548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:24:54.953 [2024-06-10 13:53:09.413576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:54.953 [2024-06-10 13:53:09.413591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa451b0
00:24:54.953 [2024-06-10 13:53:09.413598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:54.953 [2024-06-10 13:53:09.414718] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:54.953 [2024-06-10 13:53:09.414736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:24:54.953 spare
00:24:54.953 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
00:24:55.212 [2024-06-10 13:53:09.614072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:24:55.213 [2024-06-10 13:53:09.615124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:24:55.213 [2024-06-10 13:53:09.615259] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa4f760
00:24:55.213 [2024-06-10 13:53:09.615268] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:24:55.213 [2024-06-10 13:53:09.615316] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8ba510
00:24:55.213 [2024-06-10 13:53:09.615380] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa4f760
00:24:55.213 [2024-06-10 13:53:09.615386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa4f760
00:24:55.213 [2024-06-10 13:53:09.615427] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:55.213 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:55.473 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:55.473 "name": "raid_bdev1",
00:24:55.473 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:24:55.473 "strip_size_kb": 0,
00:24:55.473 "state": "online",
00:24:55.473 "raid_level": "raid1",
00:24:55.473 "superblock": true,
00:24:55.473 "num_base_bdevs": 2,
00:24:55.473 "num_base_bdevs_discovered": 2,
00:24:55.473 "num_base_bdevs_operational": 2,
00:24:55.473 "base_bdevs_list": [
00:24:55.473 {
00:24:55.473 "name": "BaseBdev1",
00:24:55.473 "uuid": "746560df-5040-57f5-ac88-19c598bad060",
00:24:55.473 "is_configured": true,
00:24:55.473 "data_offset": 256,
00:24:55.473 "data_size": 7936
00:24:55.474 },
00:24:55.474 {
00:24:55.474 "name": "BaseBdev2",
00:24:55.474 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:24:55.474 "is_configured": true,
00:24:55.474 "data_offset": 256,
00:24:55.474 "data_size": 7936
00:24:55.474 }
00:24:55.474 ]
00:24:55.474 }'
00:24:55.474 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:55.474 13:53:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:24:56.044 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:24:56.044 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:24:56.303 [2024-06-10 13:53:10.560711] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:24:56.303 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936
00:24:56.303 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:56.303 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']'
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:24:56.564 [2024-06-10 13:53:10.969536] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:56.564 13:53:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:56.824 13:53:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:56.824 "name": "raid_bdev1",
00:24:56.824 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:24:56.824 "strip_size_kb": 0,
00:24:56.824 "state": "online",
00:24:56.824 "raid_level": "raid1",
00:24:56.824 "superblock": true,
00:24:56.824 "num_base_bdevs": 2,
00:24:56.824 "num_base_bdevs_discovered": 1,
00:24:56.824 "num_base_bdevs_operational": 1,
00:24:56.824 "base_bdevs_list": [
00:24:56.824 {
00:24:56.824 "name": null,
00:24:56.824 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:56.824 "is_configured": false,
00:24:56.824 "data_offset": 256,
00:24:56.824 "data_size": 7936
00:24:56.824 },
00:24:56.824 {
00:24:56.824 "name": "BaseBdev2",
00:24:56.824 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:24:56.824 "is_configured": true,
00:24:56.824 "data_offset": 256,
00:24:56.824 "data_size": 7936
00:24:56.824 }
00:24:56.824 ]
00:24:56.824 }'
00:24:56.824 13:53:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:56.824 13:53:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:24:57.395 13:53:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:24:57.656 [2024-06-10 13:53:11.903918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:24:57.656 [2024-06-10 13:53:11.906505] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa50f60
00:24:57.656 [2024-06-10 13:53:11.908227] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:24:57.656 13:53:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:58.595 13:53:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:24:58.855 "name": "raid_bdev1",
00:24:58.855 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:24:58.855 "strip_size_kb": 0,
00:24:58.855 "state": "online",
00:24:58.855 "raid_level": "raid1",
00:24:58.855 "superblock": true,
00:24:58.855 "num_base_bdevs": 2,
00:24:58.855 "num_base_bdevs_discovered": 2,
00:24:58.855 "num_base_bdevs_operational": 2,
00:24:58.855 "process": {
00:24:58.855 "type": "rebuild",
00:24:58.855 "target": "spare",
00:24:58.855 "progress": {
00:24:58.855 "blocks": 3072,
00:24:58.855 "percent": 38
00:24:58.855 }
00:24:58.855 },
00:24:58.855 "base_bdevs_list": [
00:24:58.855 {
00:24:58.855 "name": "spare",
00:24:58.855 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f",
00:24:58.855 "is_configured": true,
00:24:58.855 "data_offset": 256,
00:24:58.855 "data_size": 7936
00:24:58.855 },
00:24:58.855 {
00:24:58.855 "name": "BaseBdev2",
00:24:58.855 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:24:58.855 "is_configured": true,
00:24:58.855 "data_offset": 256,
00:24:58.855 "data_size": 7936
00:24:58.855 }
00:24:58.855 ]
00:24:58.855 }'
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:24:58.855 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:24:59.115 [2024-06-10 13:53:13.412640] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:24:59.115 [2024-06-10 13:53:13.417438] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:24:59.115 [2024-06-10 13:53:13.417471] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:59.115 [2024-06-10 13:53:13.417481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:24:59.115 [2024-06-10 13:53:13.417486] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:59.115 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:59.375 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:59.375 "name": "raid_bdev1",
00:24:59.375 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:24:59.375 "strip_size_kb": 0,
00:24:59.375 "state": "online",
00:24:59.375 "raid_level": "raid1",
00:24:59.375 "superblock": true,
00:24:59.375 "num_base_bdevs": 2,
00:24:59.375 "num_base_bdevs_discovered": 1,
00:24:59.375 "num_base_bdevs_operational": 1,
00:24:59.375 "base_bdevs_list": [
00:24:59.375 {
00:24:59.375 "name": null,
00:24:59.375 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:59.375 "is_configured": false,
00:24:59.375 "data_offset": 256,
00:24:59.375 "data_size": 7936
00:24:59.375 },
00:24:59.375 {
00:24:59.375 "name": "BaseBdev2",
00:24:59.375 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:24:59.375 "is_configured": true,
00:24:59.375 "data_offset": 256,
00:24:59.375 "data_size": 7936
00:24:59.375 }
00:24:59.375 ]
00:24:59.375 }'
00:24:59.375 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:59.375 13:53:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:24:59.944 "name": "raid_bdev1",
00:24:59.944 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:24:59.944 "strip_size_kb": 0,
00:24:59.944 "state": "online",
00:24:59.944 "raid_level": "raid1",
00:24:59.944 "superblock": true,
00:24:59.944 "num_base_bdevs": 2,
00:24:59.944 "num_base_bdevs_discovered": 1,
00:24:59.944 "num_base_bdevs_operational": 1,
00:24:59.944 "base_bdevs_list": [
00:24:59.944 {
00:24:59.944 "name": null,
00:24:59.944 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:59.944 "is_configured": false,
00:24:59.944 "data_offset": 256,
00:24:59.944 "data_size": 7936
00:24:59.944 },
00:24:59.944 {
00:24:59.944 "name": "BaseBdev2",
00:24:59.944 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:24:59.944 "is_configured": true,
00:24:59.944 "data_offset": 256,
00:24:59.944 "data_size": 7936
00:24:59.944 }
00:24:59.944 ]
00:24:59.944 }'
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:24:59.944 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:25:00.204 [2024-06-10 13:53:14.579847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:25:00.204 [2024-06-10 13:53:14.582340] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa609e0
00:25:00.204 [2024-06-10 13:53:14.583577] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:25:00.204 13:53:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:01.143 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:01.403 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:01.403 "name": "raid_bdev1",
00:25:01.403 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:25:01.403 "strip_size_kb": 0,
00:25:01.403 "state": "online",
00:25:01.403 "raid_level": "raid1",
00:25:01.403 "superblock": true,
00:25:01.403 "num_base_bdevs": 2,
00:25:01.403 "num_base_bdevs_discovered": 2,
00:25:01.403 "num_base_bdevs_operational": 2,
00:25:01.403 "process": {
00:25:01.403 "type": "rebuild",
00:25:01.403 "target": "spare",
00:25:01.403 "progress": {
00:25:01.403 "blocks": 2816,
00:25:01.403 "percent": 35
00:25:01.403 }
00:25:01.403 },
00:25:01.403 "base_bdevs_list": [
00:25:01.403 {
00:25:01.403 "name": "spare",
00:25:01.403 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f",
00:25:01.403 "is_configured": true,
00:25:01.403 "data_offset": 256,
00:25:01.403 "data_size": 7936
00:25:01.403 },
00:25:01.403 {
00:25:01.403 "name": "BaseBdev2",
00:25:01.403 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:25:01.403 "is_configured": true,
00:25:01.403 "data_offset": 256,
00:25:01.403 "data_size": 7936
00:25:01.403 }
00:25:01.403 ]
00:25:01.403 }'
00:25:01.403 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:01.403 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:25:01.403 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:25:01.663 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=983
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:01.663 13:53:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:01.663 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:01.663 "name": "raid_bdev1",
00:25:01.663 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:25:01.663 "strip_size_kb": 0,
00:25:01.663 "state": "online",
00:25:01.663 "raid_level": "raid1",
00:25:01.663 "superblock": true,
00:25:01.663 "num_base_bdevs": 2,
00:25:01.663 "num_base_bdevs_discovered": 2,
00:25:01.663 "num_base_bdevs_operational": 2,
00:25:01.663 "process": {
00:25:01.663 "type": "rebuild",
00:25:01.663 "target": "spare",
00:25:01.663 "progress": {
00:25:01.663 "blocks": 3584,
00:25:01.663 "percent": 45
00:25:01.663 }
00:25:01.663 },
00:25:01.663 "base_bdevs_list": [
00:25:01.663 {
00:25:01.663 "name": "spare",
00:25:01.663 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f",
00:25:01.663 "is_configured": true,
00:25:01.663 "data_offset": 256,
00:25:01.663 "data_size": 7936
00:25:01.663 },
00:25:01.663 {
00:25:01.663 "name": "BaseBdev2",
00:25:01.663 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:25:01.663 "is_configured": true,
00:25:01.663 "data_offset": 256,
00:25:01.663 "data_size": 7936
00:25:01.663 }
00:25:01.663 ]
00:25:01.663 }'
00:25:01.923 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:01.923 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:25:01.923 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:01.923 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:25:01.923 13:53:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:02.862 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:03.122 "name": "raid_bdev1",
00:25:03.122 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494",
00:25:03.122 "strip_size_kb": 0,
00:25:03.122 "state": "online",
00:25:03.122 "raid_level": "raid1",
00:25:03.122 "superblock": true,
00:25:03.122 "num_base_bdevs": 2,
00:25:03.122 "num_base_bdevs_discovered": 2,
00:25:03.122 "num_base_bdevs_operational": 2,
00:25:03.122 "process": {
00:25:03.122 "type": "rebuild",
00:25:03.122 "target": "spare",
00:25:03.122 "progress": {
00:25:03.122 "blocks": 6912,
00:25:03.122 "percent": 87
00:25:03.122 }
00:25:03.122 },
00:25:03.122 "base_bdevs_list": [
00:25:03.122 {
00:25:03.122 "name": "spare",
00:25:03.122 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f",
00:25:03.122 "is_configured": true,
00:25:03.122 "data_offset": 256,
00:25:03.122 "data_size": 7936
00:25:03.122 },
00:25:03.122 {
00:25:03.122 "name": "BaseBdev2",
00:25:03.122 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441",
00:25:03.122 "is_configured": true,
00:25:03.122 "data_offset": 256,
00:25:03.122 "data_size": 7936
00:25:03.122 }
00:25:03.122 ]
00:25:03.122 }'
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:25:03.122 13:53:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1
00:25:03.382 [2024-06-10 13:53:17.701923] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:25:03.382 [2024-06-10 13:53:17.701969] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:25:03.382 [2024-06-10 13:53:17.702033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved --
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.406 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.406 "name": "raid_bdev1", 00:25:04.406 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:04.406 "strip_size_kb": 0, 00:25:04.406 "state": "online", 00:25:04.406 "raid_level": "raid1", 00:25:04.406 "superblock": true, 00:25:04.406 "num_base_bdevs": 2, 00:25:04.406 "num_base_bdevs_discovered": 2, 00:25:04.406 "num_base_bdevs_operational": 2, 00:25:04.406 "base_bdevs_list": [ 00:25:04.406 { 00:25:04.406 "name": "spare", 00:25:04.406 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:04.406 "is_configured": true, 00:25:04.406 "data_offset": 256, 00:25:04.406 "data_size": 7936 00:25:04.406 }, 00:25:04.406 { 00:25:04.406 "name": "BaseBdev2", 00:25:04.407 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:04.407 "is_configured": true, 00:25:04.407 "data_offset": 256, 00:25:04.407 "data_size": 7936 00:25:04.407 } 00:25:04.407 ] 00:25:04.407 }' 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.407 13:53:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.666 "name": "raid_bdev1", 00:25:04.666 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:04.666 "strip_size_kb": 0, 00:25:04.666 "state": "online", 00:25:04.666 "raid_level": "raid1", 00:25:04.666 "superblock": true, 00:25:04.666 "num_base_bdevs": 2, 00:25:04.666 "num_base_bdevs_discovered": 2, 00:25:04.666 "num_base_bdevs_operational": 2, 00:25:04.666 "base_bdevs_list": [ 00:25:04.666 { 00:25:04.666 "name": "spare", 00:25:04.666 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:04.666 
"is_configured": true, 00:25:04.666 "data_offset": 256, 00:25:04.666 "data_size": 7936 00:25:04.666 }, 00:25:04.666 { 00:25:04.666 "name": "BaseBdev2", 00:25:04.666 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:04.666 "is_configured": true, 00:25:04.666 "data_offset": 256, 00:25:04.666 "data_size": 7936 00:25:04.666 } 00:25:04.666 ] 00:25:04.666 }' 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.666 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.926 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.926 "name": "raid_bdev1", 00:25:04.926 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:04.926 "strip_size_kb": 0, 00:25:04.926 "state": "online", 00:25:04.926 "raid_level": "raid1", 00:25:04.926 "superblock": true, 00:25:04.926 "num_base_bdevs": 2, 00:25:04.926 "num_base_bdevs_discovered": 2, 00:25:04.926 "num_base_bdevs_operational": 2, 00:25:04.926 "base_bdevs_list": [ 00:25:04.926 { 00:25:04.926 "name": "spare", 00:25:04.926 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:04.926 "is_configured": true, 00:25:04.926 "data_offset": 256, 00:25:04.926 "data_size": 7936 00:25:04.926 }, 00:25:04.926 { 00:25:04.926 "name": "BaseBdev2", 00:25:04.926 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:04.926 "is_configured": true, 00:25:04.926 "data_offset": 256, 00:25:04.926 "data_size": 7936 00:25:04.926 } 00:25:04.926 ] 00:25:04.926 }' 00:25:04.926 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.926 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:05.496 13:53:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:05.756 [2024-06-10 13:53:20.024092] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:25:05.756 [2024-06-10 13:53:20.024111] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:05.756 [2024-06-10 13:53:20.024157] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:05.756 [2024-06-10 13:53:20.024207] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:05.756 [2024-06-10 13:53:20.024213] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa4f760 name raid_bdev1, state offline 00:25:05.756 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.756 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:25:06.016 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:06.016 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:25:06.016 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:06.016 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:06.016 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:06.277 [2024-06-10 13:53:20.617556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:06.277 [2024-06-10 13:53:20.617580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:06.277 [2024-06-10 13:53:20.617594] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8bc320 00:25:06.277 [2024-06-10 13:53:20.617601] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:06.277 [2024-06-10 13:53:20.618825] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:06.277 [2024-06-10 13:53:20.618844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:06.277 [2024-06-10 13:53:20.618886] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:06.277 [2024-06-10 13:53:20.618905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:06.277 [2024-06-10 13:53:20.618973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:06.277 spare 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.277 13:53:20 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.277 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.277 [2024-06-10 13:53:20.719259] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa51950 00:25:06.277 [2024-06-10 13:53:20.719267] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:25:06.277 [2024-06-10 13:53:20.719320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa60340 00:25:06.277 [2024-06-10 13:53:20.719387] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa51950 00:25:06.277 [2024-06-10 13:53:20.719392] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa51950 00:25:06.277 [2024-06-10 13:53:20.719440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.537 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:06.537 "name": "raid_bdev1", 00:25:06.537 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:06.537 "strip_size_kb": 0, 00:25:06.537 "state": "online", 00:25:06.537 "raid_level": "raid1", 00:25:06.537 "superblock": true, 00:25:06.537 "num_base_bdevs": 2, 00:25:06.537 "num_base_bdevs_discovered": 2, 00:25:06.537 "num_base_bdevs_operational": 2, 00:25:06.537 "base_bdevs_list": [ 00:25:06.537 { 00:25:06.537 "name": "spare", 00:25:06.537 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:06.537 "is_configured": true, 00:25:06.537 "data_offset": 256, 00:25:06.537 "data_size": 7936 00:25:06.537 }, 00:25:06.537 { 00:25:06.537 "name": "BaseBdev2", 00:25:06.537 "uuid": 
"d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:06.537 "is_configured": true, 00:25:06.537 "data_offset": 256, 00:25:06.537 "data_size": 7936 00:25:06.537 } 00:25:06.537 ] 00:25:06.537 }' 00:25:06.537 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:06.537 13:53:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.107 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.367 "name": "raid_bdev1", 00:25:07.367 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:07.367 "strip_size_kb": 0, 00:25:07.367 "state": "online", 00:25:07.367 "raid_level": "raid1", 00:25:07.367 "superblock": true, 00:25:07.367 "num_base_bdevs": 2, 00:25:07.367 "num_base_bdevs_discovered": 2, 00:25:07.367 "num_base_bdevs_operational": 2, 00:25:07.367 "base_bdevs_list": [ 00:25:07.367 { 00:25:07.367 "name": "spare", 00:25:07.367 "uuid": 
"a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:07.367 "is_configured": true, 00:25:07.367 "data_offset": 256, 00:25:07.367 "data_size": 7936 00:25:07.367 }, 00:25:07.367 { 00:25:07.367 "name": "BaseBdev2", 00:25:07.367 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:07.367 "is_configured": true, 00:25:07.367 "data_offset": 256, 00:25:07.367 "data_size": 7936 00:25:07.367 } 00:25:07.367 ] 00:25:07.367 }' 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.367 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:07.627 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.627 13:53:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:07.627 [2024-06-10 13:53:22.097398] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.886 "name": "raid_bdev1", 00:25:07.886 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:07.886 "strip_size_kb": 0, 00:25:07.886 "state": "online", 00:25:07.886 "raid_level": "raid1", 00:25:07.886 "superblock": true, 00:25:07.886 "num_base_bdevs": 2, 00:25:07.886 "num_base_bdevs_discovered": 1, 00:25:07.886 "num_base_bdevs_operational": 1, 00:25:07.886 "base_bdevs_list": [ 00:25:07.886 { 00:25:07.886 "name": null, 00:25:07.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.886 "is_configured": false, 00:25:07.886 "data_offset": 
256, 00:25:07.886 "data_size": 7936 00:25:07.886 }, 00:25:07.886 { 00:25:07.886 "name": "BaseBdev2", 00:25:07.886 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:07.886 "is_configured": true, 00:25:07.886 "data_offset": 256, 00:25:07.886 "data_size": 7936 00:25:07.886 } 00:25:07.886 ] 00:25:07.886 }' 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.886 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:08.455 13:53:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.715 [2024-06-10 13:53:23.043812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.715 [2024-06-10 13:53:23.043931] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:08.715 [2024-06-10 13:53:23.043941] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:08.715 [2024-06-10 13:53:23.043957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.715 [2024-06-10 13:53:23.046418] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa60340 00:25:08.715 [2024-06-10 13:53:23.047602] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.715 13:53:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.654 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.655 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.915 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.915 "name": "raid_bdev1", 00:25:09.915 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:09.915 "strip_size_kb": 0, 00:25:09.915 "state": "online", 00:25:09.915 "raid_level": "raid1", 00:25:09.915 "superblock": true, 00:25:09.915 "num_base_bdevs": 2, 00:25:09.915 "num_base_bdevs_discovered": 2, 00:25:09.915 "num_base_bdevs_operational": 2, 00:25:09.915 "process": { 00:25:09.915 "type": 
"rebuild", 00:25:09.915 "target": "spare", 00:25:09.915 "progress": { 00:25:09.915 "blocks": 2816, 00:25:09.915 "percent": 35 00:25:09.915 } 00:25:09.915 }, 00:25:09.915 "base_bdevs_list": [ 00:25:09.915 { 00:25:09.915 "name": "spare", 00:25:09.915 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:09.915 "is_configured": true, 00:25:09.915 "data_offset": 256, 00:25:09.915 "data_size": 7936 00:25:09.915 }, 00:25:09.915 { 00:25:09.915 "name": "BaseBdev2", 00:25:09.915 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:09.915 "is_configured": true, 00:25:09.915 "data_offset": 256, 00:25:09.915 "data_size": 7936 00:25:09.915 } 00:25:09.915 ] 00:25:09.915 }' 00:25:09.915 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.915 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.915 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.915 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.916 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:10.175 [2024-06-10 13:53:24.512538] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.176 [2024-06-10 13:53:24.556612] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:10.176 [2024-06-10 13:53:24.556640] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.176 [2024-06-10 13:53:24.556650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.176 [2024-06-10 13:53:24.556655] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.176 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.435 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.435 "name": "raid_bdev1", 00:25:10.435 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:10.435 "strip_size_kb": 0, 00:25:10.435 "state": "online", 00:25:10.435 "raid_level": "raid1", 00:25:10.435 "superblock": true, 00:25:10.435 
"num_base_bdevs": 2, 00:25:10.435 "num_base_bdevs_discovered": 1, 00:25:10.435 "num_base_bdevs_operational": 1, 00:25:10.435 "base_bdevs_list": [ 00:25:10.435 { 00:25:10.435 "name": null, 00:25:10.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.435 "is_configured": false, 00:25:10.435 "data_offset": 256, 00:25:10.435 "data_size": 7936 00:25:10.435 }, 00:25:10.435 { 00:25:10.435 "name": "BaseBdev2", 00:25:10.435 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:10.435 "is_configured": true, 00:25:10.435 "data_offset": 256, 00:25:10.435 "data_size": 7936 00:25:10.435 } 00:25:10.435 ] 00:25:10.435 }' 00:25:10.435 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.435 13:53:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:11.005 13:53:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:11.265 [2024-06-10 13:53:25.507033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:11.265 [2024-06-10 13:53:25.507067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.265 [2024-06-10 13:53:25.507083] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8ba1f0 00:25:11.265 [2024-06-10 13:53:25.507090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.265 [2024-06-10 13:53:25.507252] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.265 [2024-06-10 13:53:25.507263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:11.265 [2024-06-10 13:53:25.507305] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:11.265 [2024-06-10 13:53:25.507312] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:11.265 [2024-06-10 13:53:25.507318] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:11.265 [2024-06-10 13:53:25.507329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:11.265 [2024-06-10 13:53:25.509752] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8bccc0 00:25:11.265 [2024-06-10 13:53:25.510913] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:11.265 spare 00:25:11.265 13:53:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:12.205 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.205 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.205 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.205 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.205 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.206 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.206 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.465 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.465 "name": "raid_bdev1", 00:25:12.465 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:12.465 
"strip_size_kb": 0, 00:25:12.465 "state": "online", 00:25:12.465 "raid_level": "raid1", 00:25:12.465 "superblock": true, 00:25:12.465 "num_base_bdevs": 2, 00:25:12.465 "num_base_bdevs_discovered": 2, 00:25:12.465 "num_base_bdevs_operational": 2, 00:25:12.465 "process": { 00:25:12.465 "type": "rebuild", 00:25:12.465 "target": "spare", 00:25:12.465 "progress": { 00:25:12.465 "blocks": 2816, 00:25:12.465 "percent": 35 00:25:12.465 } 00:25:12.465 }, 00:25:12.465 "base_bdevs_list": [ 00:25:12.465 { 00:25:12.465 "name": "spare", 00:25:12.465 "uuid": "a0bea8fc-8db6-5938-8acb-c773e90d638f", 00:25:12.465 "is_configured": true, 00:25:12.465 "data_offset": 256, 00:25:12.465 "data_size": 7936 00:25:12.465 }, 00:25:12.465 { 00:25:12.465 "name": "BaseBdev2", 00:25:12.465 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:12.465 "is_configured": true, 00:25:12.465 "data_offset": 256, 00:25:12.465 "data_size": 7936 00:25:12.465 } 00:25:12.465 ] 00:25:12.465 }' 00:25:12.465 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.465 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.466 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.466 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:12.466 13:53:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:12.726 [2024-06-10 13:53:26.999700] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.726 [2024-06-10 13:53:27.020026] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:12.726 [2024-06-10 13:53:27.020056] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.726 [2024-06-10 13:53:27.020065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.726 [2024-06-10 13:53:27.020070] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.726 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.985 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.985 "name": "raid_bdev1", 00:25:12.985 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:12.985 "strip_size_kb": 0, 00:25:12.985 "state": "online", 00:25:12.985 "raid_level": "raid1", 00:25:12.985 "superblock": true, 00:25:12.985 "num_base_bdevs": 2, 00:25:12.985 "num_base_bdevs_discovered": 1, 00:25:12.985 "num_base_bdevs_operational": 1, 00:25:12.985 "base_bdevs_list": [ 00:25:12.985 { 00:25:12.985 "name": null, 00:25:12.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.985 "is_configured": false, 00:25:12.985 "data_offset": 256, 00:25:12.985 "data_size": 7936 00:25:12.985 }, 00:25:12.985 { 00:25:12.985 "name": "BaseBdev2", 00:25:12.985 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:12.985 "is_configured": true, 00:25:12.985 "data_offset": 256, 00:25:12.985 "data_size": 7936 00:25:12.985 } 00:25:12.985 ] 00:25:12.985 }' 00:25:12.985 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.985 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.556 
13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.556 "name": "raid_bdev1", 00:25:13.556 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:13.556 "strip_size_kb": 0, 00:25:13.556 "state": "online", 00:25:13.556 "raid_level": "raid1", 00:25:13.556 "superblock": true, 00:25:13.556 "num_base_bdevs": 2, 00:25:13.556 "num_base_bdevs_discovered": 1, 00:25:13.556 "num_base_bdevs_operational": 1, 00:25:13.556 "base_bdevs_list": [ 00:25:13.556 { 00:25:13.556 "name": null, 00:25:13.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.556 "is_configured": false, 00:25:13.556 "data_offset": 256, 00:25:13.556 "data_size": 7936 00:25:13.556 }, 00:25:13.556 { 00:25:13.556 "name": "BaseBdev2", 00:25:13.556 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:13.556 "is_configured": true, 00:25:13.556 "data_offset": 256, 00:25:13.556 "data_size": 7936 00:25:13.556 } 00:25:13.556 ] 00:25:13.556 }' 00:25:13.556 13:53:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.817 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.817 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.817 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.817 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:13.817 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:14.077 [2024-06-10 13:53:28.475519] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:14.077 [2024-06-10 13:53:28.475548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.077 [2024-06-10 13:53:28.475568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8bff80 00:25:14.077 [2024-06-10 13:53:28.475575] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.077 [2024-06-10 13:53:28.475707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.077 [2024-06-10 13:53:28.475717] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:14.077 [2024-06-10 13:53:28.475749] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:14.077 [2024-06-10 13:53:28.475756] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:14.077 [2024-06-10 13:53:28.475761] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:14.077 BaseBdev1 00:25:14.077 13:53:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.459 "name": "raid_bdev1", 00:25:15.459 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:15.459 "strip_size_kb": 0, 00:25:15.459 "state": "online", 00:25:15.459 "raid_level": "raid1", 00:25:15.459 "superblock": true, 00:25:15.459 "num_base_bdevs": 2, 00:25:15.459 "num_base_bdevs_discovered": 1, 00:25:15.459 "num_base_bdevs_operational": 1, 00:25:15.459 "base_bdevs_list": [ 00:25:15.459 { 00:25:15.459 "name": null, 00:25:15.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.459 "is_configured": false, 00:25:15.459 "data_offset": 256, 00:25:15.459 "data_size": 7936 00:25:15.459 }, 00:25:15.459 { 00:25:15.459 "name": "BaseBdev2", 00:25:15.459 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:15.459 "is_configured": true, 00:25:15.459 "data_offset": 256, 00:25:15.459 "data_size": 7936 00:25:15.459 } 00:25:15.459 ] 00:25:15.459 }' 
00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.459 13:53:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.029 "name": "raid_bdev1", 00:25:16.029 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:16.029 "strip_size_kb": 0, 00:25:16.029 "state": "online", 00:25:16.029 "raid_level": "raid1", 00:25:16.029 "superblock": true, 00:25:16.029 "num_base_bdevs": 2, 00:25:16.029 "num_base_bdevs_discovered": 1, 00:25:16.029 "num_base_bdevs_operational": 1, 00:25:16.029 "base_bdevs_list": [ 00:25:16.029 { 00:25:16.029 "name": null, 00:25:16.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.029 "is_configured": false, 00:25:16.029 "data_offset": 256, 00:25:16.029 "data_size": 7936 00:25:16.029 }, 00:25:16.029 { 00:25:16.029 "name": "BaseBdev2", 00:25:16.029 
"uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:16.029 "is_configured": true, 00:25:16.029 "data_offset": 256, 00:25:16.029 "data_size": 7936 00:25:16.029 } 00:25:16.029 ] 00:25:16.029 }' 00:25:16.029 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:16.290 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:16.290 [2024-06-10 13:53:30.761364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:16.290 [2024-06-10 13:53:30.761462] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:16.290 [2024-06-10 13:53:30.761470] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:16.551 request: 00:25:16.551 { 00:25:16.552 "raid_bdev": "raid_bdev1", 00:25:16.552 "base_bdev": "BaseBdev1", 00:25:16.552 "method": "bdev_raid_add_base_bdev", 00:25:16.552 "req_id": 1 00:25:16.552 } 00:25:16.552 Got JSON-RPC error response 00:25:16.552 response: 00:25:16.552 { 00:25:16.552 "code": -22, 00:25:16.552 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:16.552 } 00:25:16.552 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:25:16.552 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:16.552 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:16.552 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:16.552 13:53:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.493 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.755 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:25:17.755 "name": "raid_bdev1", 00:25:17.755 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:17.755 "strip_size_kb": 0, 00:25:17.755 "state": "online", 00:25:17.755 "raid_level": "raid1", 00:25:17.755 "superblock": true, 00:25:17.755 "num_base_bdevs": 2, 00:25:17.755 "num_base_bdevs_discovered": 1, 00:25:17.755 "num_base_bdevs_operational": 1, 00:25:17.755 "base_bdevs_list": [ 00:25:17.755 { 00:25:17.755 "name": null, 00:25:17.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.755 "is_configured": false, 00:25:17.755 "data_offset": 256, 00:25:17.755 "data_size": 7936 00:25:17.755 }, 00:25:17.755 { 00:25:17.755 "name": "BaseBdev2", 00:25:17.755 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:17.755 "is_configured": true, 00:25:17.755 "data_offset": 256, 00:25:17.755 "data_size": 7936 00:25:17.755 } 00:25:17.755 ] 00:25:17.755 }' 00:25:17.755 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.755 13:53:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.327 13:53:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.327 "name": "raid_bdev1", 00:25:18.327 "uuid": "b7aac3df-09dc-4a82-91ea-0471e810d494", 00:25:18.327 "strip_size_kb": 0, 00:25:18.327 "state": "online", 00:25:18.327 "raid_level": "raid1", 00:25:18.327 "superblock": true, 00:25:18.327 "num_base_bdevs": 2, 00:25:18.327 "num_base_bdevs_discovered": 1, 00:25:18.327 "num_base_bdevs_operational": 1, 00:25:18.327 "base_bdevs_list": [ 00:25:18.327 { 00:25:18.327 "name": null, 00:25:18.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.327 "is_configured": false, 00:25:18.327 "data_offset": 256, 00:25:18.327 "data_size": 7936 00:25:18.327 }, 00:25:18.327 { 00:25:18.327 "name": "BaseBdev2", 00:25:18.327 "uuid": "d4eec2ff-fc1c-50fb-a785-de6cebe42441", 00:25:18.327 "is_configured": true, 00:25:18.327 "data_offset": 256, 00:25:18.327 "data_size": 7936 00:25:18.327 } 00:25:18.327 ] 00:25:18.327 }' 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:18.327 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1689947 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 1689947 ']' 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 1689947 00:25:18.587 13:53:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1689947 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1689947' 00:25:18.587 killing process with pid 1689947 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 1689947 00:25:18.587 Received shutdown signal, test time was about 60.000000 seconds 00:25:18.587 00:25:18.587 Latency(us) 00:25:18.587 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:18.587 =================================================================================================================== 00:25:18.587 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:18.587 [2024-06-10 13:53:32.885303] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:18.587 [2024-06-10 13:53:32.885372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:18.587 [2024-06-10 13:53:32.885402] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:18.587 [2024-06-10 13:53:32.885409] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa51950 name raid_bdev1, state offline 00:25:18.587 13:53:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 1689947 00:25:18.587 [2024-06-10 
13:53:32.901321] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:18.587 13:53:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:25:18.587 00:25:18.587 real 0m25.927s 00:25:18.587 user 0m41.345s 00:25:18.587 sys 0m2.741s 00:25:18.587 13:53:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:18.587 13:53:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:25:18.587 ************************************ 00:25:18.587 END TEST raid_rebuild_test_sb_md_interleaved 00:25:18.587 ************************************ 00:25:18.848 13:53:33 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:25:18.848 13:53:33 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:25:18.848 13:53:33 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1689947 ']' 00:25:18.848 13:53:33 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1689947 00:25:18.848 13:53:33 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:25:18.848 00:25:18.848 real 16m10.272s 00:25:18.848 user 27m45.749s 00:25:18.848 sys 2m21.888s 00:25:18.848 13:53:33 bdev_raid -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:18.849 13:53:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:18.849 ************************************ 00:25:18.849 END TEST bdev_raid 00:25:18.849 ************************************ 00:25:18.849 13:53:33 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:18.849 13:53:33 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:18.849 13:53:33 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:18.849 13:53:33 -- common/autotest_common.sh@10 -- # set +x 00:25:18.849 ************************************ 00:25:18.849 START TEST bdevperf_config 00:25:18.849 ************************************ 00:25:18.849 13:53:33 bdevperf_config -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:25:18.849 * Looking for test storage... 00:25:18.849 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:18.849 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:18.849 13:53:33 bdevperf_config 
-- bdevperf/common.sh@9 -- # local rw= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:18.849 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:18.849 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:18.849 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:18.849 
13:53:33 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:18.849 00:25:18.849 13:53:33 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:19.109 13:53:33 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-06-10 13:53:33.392834] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:21.652 [2024-06-10 13:53:33.392900] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1695362 ] 00:25:21.652 Using job config with 4 jobs 00:25:21.652 [2024-06-10 13:53:33.509240] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:21.652 [2024-06-10 13:53:33.587266] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.652 cpumask for '\''job0'\'' is too big 00:25:21.652 cpumask for '\''job1'\'' is too big 00:25:21.652 cpumask for '\''job2'\'' is too big 00:25:21.652 cpumask for '\''job3'\'' is too big 00:25:21.652 Running I/O for 2 seconds... 
00:25:21.652 00:25:21.652 Latency(us) 00:25:21.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26646.84 26.02 0.00 0.00 9593.64 1733.97 14854.83 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26624.52 26.00 0.00 0.00 9581.99 1733.97 13107.20 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26602.27 25.98 0.00 0.00 9570.25 1693.01 11468.80 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26580.03 25.96 0.00 0.00 9559.15 1693.01 9830.40 00:25:21.652 =================================================================================================================== 00:25:21.652 Total : 106453.67 103.96 0.00 0.00 9576.26 1693.01 14854.83' 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-06-10 13:53:33.392834] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:21.652 [2024-06-10 13:53:33.392900] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1695362 ] 00:25:21.652 Using job config with 4 jobs 00:25:21.652 [2024-06-10 13:53:33.509240] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:21.652 [2024-06-10 13:53:33.587266] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.652 cpumask for '\''job0'\'' is too big 00:25:21.652 cpumask for '\''job1'\'' is too big 00:25:21.652 cpumask for '\''job2'\'' is too big 00:25:21.652 cpumask for '\''job3'\'' is too big 00:25:21.652 Running I/O for 2 seconds... 
00:25:21.652 00:25:21.652 Latency(us) 00:25:21.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26646.84 26.02 0.00 0.00 9593.64 1733.97 14854.83 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26624.52 26.00 0.00 0.00 9581.99 1733.97 13107.20 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26602.27 25.98 0.00 0.00 9570.25 1693.01 11468.80 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26580.03 25.96 0.00 0.00 9559.15 1693.01 9830.40 00:25:21.652 =================================================================================================================== 00:25:21.652 Total : 106453.67 103.96 0.00 0.00 9576.26 1693.01 14854.83' 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 13:53:33.392834] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:21.652 [2024-06-10 13:53:33.392900] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1695362 ] 00:25:21.652 Using job config with 4 jobs 00:25:21.652 [2024-06-10 13:53:33.509240] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:21.652 [2024-06-10 13:53:33.587266] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.652 cpumask for '\''job0'\'' is too big 00:25:21.652 cpumask for '\''job1'\'' is too big 00:25:21.652 cpumask for '\''job2'\'' is too big 00:25:21.652 cpumask for '\''job3'\'' is too big 00:25:21.652 Running I/O for 2 seconds... 
00:25:21.652 00:25:21.652 Latency(us) 00:25:21.652 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26646.84 26.02 0.00 0.00 9593.64 1733.97 14854.83 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26624.52 26.00 0.00 0.00 9581.99 1733.97 13107.20 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26602.27 25.98 0.00 0.00 9570.25 1693.01 11468.80 00:25:21.652 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:21.652 Malloc0 : 2.02 26580.03 25.96 0.00 0.00 9559.15 1693.01 9830.40 00:25:21.652 =================================================================================================================== 00:25:21.652 Total : 106453.67 103.96 0.00 0.00 9576.26 1693.01 14854.83' 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:25:21.652 13:53:35 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:21.652 [2024-06-10 13:53:35.923583] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:21.652 [2024-06-10 13:53:35.923635] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1695820 ] 00:25:21.652 [2024-06-10 13:53:36.038199] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:21.913 [2024-06-10 13:53:36.137636] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.913 cpumask for 'job0' is too big 00:25:21.913 cpumask for 'job1' is too big 00:25:21.913 cpumask for 'job2' is too big 00:25:21.913 cpumask for 'job3' is too big 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:25:24.455 Running I/O for 2 seconds... 00:25:24.455 00:25:24.455 Latency(us) 00:25:24.455 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:24.455 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:24.455 Malloc0 : 2.02 26633.84 26.01 0.00 0.00 9605.24 1747.63 14854.83 00:25:24.455 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:24.455 Malloc0 : 2.02 26611.36 25.99 0.00 0.00 9593.92 1699.84 13107.20 00:25:24.455 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:24.455 Malloc0 : 2.02 26588.96 25.97 0.00 0.00 9581.60 1713.49 11468.80 00:25:24.455 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:25:24.455 Malloc0 : 2.02 26566.61 25.94 0.00 0.00 9570.08 1699.84 9830.40 00:25:24.455 =================================================================================================================== 00:25:24.455 Total : 106400.76 103.91 0.00 0.00 9587.71 1699.84 14854.83' 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:24.455 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:24.455 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:24.455 13:53:38 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:24.455 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:24.455 13:53:38 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:26.999 13:53:40 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-06-10 13:53:38.498633] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:26.999 [2024-06-10 13:53:38.498686] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696367 ] 00:25:26.999 Using job config with 3 jobs 00:25:26.999 [2024-06-10 13:53:38.609313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:26.999 [2024-06-10 13:53:38.687998] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:26.999 cpumask for '\''job0'\'' is too big 00:25:26.999 cpumask for '\''job1'\'' is too big 00:25:26.999 cpumask for '\''job2'\'' is too big 00:25:26.999 Running I/O for 2 seconds... 
00:25:26.999 00:25:26.999 Latency(us) 00:25:26.999 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:26.999 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:26.999 Malloc0 : 2.01 36078.01 35.23 0.00 0.00 7077.52 1693.01 10485.76 00:25:26.999 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.01 36089.06 35.24 0.00 0.00 7060.53 1665.71 8792.75 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.02 36058.73 35.21 0.00 0.00 7052.08 1652.05 7645.87 00:25:27.000 =================================================================================================================== 00:25:27.000 Total : 108225.80 105.69 0.00 0.00 7063.36 1652.05 10485.76' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-06-10 13:53:38.498633] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:27.000 [2024-06-10 13:53:38.498686] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696367 ] 00:25:27.000 Using job config with 3 jobs 00:25:27.000 [2024-06-10 13:53:38.609313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:27.000 [2024-06-10 13:53:38.687998] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:27.000 cpumask for '\''job0'\'' is too big 00:25:27.000 cpumask for '\''job1'\'' is too big 00:25:27.000 cpumask for '\''job2'\'' is too big 00:25:27.000 Running I/O for 2 seconds... 
00:25:27.000 00:25:27.000 Latency(us) 00:25:27.000 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.01 36078.01 35.23 0.00 0.00 7077.52 1693.01 10485.76 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.01 36089.06 35.24 0.00 0.00 7060.53 1665.71 8792.75 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.02 36058.73 35.21 0.00 0.00 7052.08 1652.05 7645.87 00:25:27.000 =================================================================================================================== 00:25:27.000 Total : 108225.80 105.69 0.00 0.00 7063.36 1652.05 10485.76' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 13:53:38.498633] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:27.000 [2024-06-10 13:53:38.498686] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696367 ] 00:25:27.000 Using job config with 3 jobs 00:25:27.000 [2024-06-10 13:53:38.609313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:27.000 [2024-06-10 13:53:38.687998] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:27.000 cpumask for '\''job0'\'' is too big 00:25:27.000 cpumask for '\''job1'\'' is too big 00:25:27.000 cpumask for '\''job2'\'' is too big 00:25:27.000 Running I/O for 2 seconds... 
00:25:27.000 00:25:27.000 Latency(us) 00:25:27.000 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.01 36078.01 35.23 0.00 0.00 7077.52 1693.01 10485.76 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.01 36089.06 35.24 0.00 0.00 7060.53 1665.71 8792.75 00:25:27.000 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:25:27.000 Malloc0 : 2.02 36058.73 35.21 0.00 0.00 7052.08 1652.05 7645.87 00:25:27.000 =================================================================================================================== 00:25:27.000 Total : 108225.80 105.69 0.00 0.00 7063.36 1652.05 10485.76' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:25:27.000 13:53:40 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:27.000 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:27.000 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:27.000 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:27.000 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:25:27.000 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:25:27.000 13:53:40 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:29.543 13:53:43 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-06-10 13:53:41.064003] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:29.543 [2024-06-10 13:53:41.064065] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696788 ] 00:25:29.543 Using job config with 4 jobs 00:25:29.543 [2024-06-10 13:53:41.181626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.543 [2024-06-10 13:53:41.261517] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.543 cpumask for '\''job0'\'' is too big 00:25:29.543 cpumask for '\''job1'\'' is too big 00:25:29.543 cpumask for '\''job2'\'' is too big 00:25:29.543 cpumask for '\''job3'\'' is too big 00:25:29.543 Running I/O for 2 seconds... 00:25:29.543 00:25:29.543 Latency(us) 00:25:29.543 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.543 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc0 : 2.03 13241.33 12.93 0.00 0.00 19308.83 3495.25 29928.11 00:25:29.543 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc1 : 2.03 13230.01 12.92 0.00 0.00 19307.91 4205.23 29928.11 00:25:29.543 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc0 : 2.03 13218.99 12.91 0.00 0.00 19261.52 3467.95 26432.85 00:25:29.543 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc1 : 2.04 13207.74 12.90 0.00 0.00 19262.06 4177.92 26432.85 00:25:29.543 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc0 : 2.04 13196.78 12.89 0.00 0.00 19216.86 3467.95 22937.60 00:25:29.543 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc1 : 2.04 13185.61 12.88 0.00 0.00 19216.89 4177.92 22937.60 00:25:29.543 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc0 : 2.05 13268.42 12.96 0.00 0.00 19034.05 3263.15 19660.80 00:25:29.543 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.543 Malloc1 : 2.05 13257.20 12.95 0.00 0.00 19033.28 2594.13 19660.80 00:25:29.543 =================================================================================================================== 00:25:29.543 Total : 105806.08 103.33 0.00 0.00 19204.77 2594.13 29928.11' 00:25:29.543 13:53:43 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-06-10 13:53:41.064003] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:29.543 [2024-06-10 13:53:41.064065] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696788 ] 00:25:29.543 Using job config with 4 jobs 00:25:29.543 [2024-06-10 13:53:41.181626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.543 [2024-06-10 13:53:41.261517] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.543 cpumask for '\''job0'\'' is too big 00:25:29.543 cpumask for '\''job1'\'' is too big 00:25:29.543 cpumask for '\''job2'\'' is too big 00:25:29.543 cpumask for '\''job3'\'' is too big 00:25:29.543 Running I/O for 2 seconds... 
00:25:29.543 00:25:29.543 Latency(us) 00:25:29.543 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.543 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.03 13241.33 12.93 0.00 0.00 19308.83 3495.25 29928.11 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.03 13230.01 12.92 0.00 0.00 19307.91 4205.23 29928.11 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.03 13218.99 12.91 0.00 0.00 19261.52 3467.95 26432.85 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.04 13207.74 12.90 0.00 0.00 19262.06 4177.92 26432.85 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.04 13196.78 12.89 0.00 0.00 19216.86 3467.95 22937.60 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.04 13185.61 12.88 0.00 0.00 19216.89 4177.92 22937.60 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.05 13268.42 12.96 0.00 0.00 19034.05 3263.15 19660.80 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.05 13257.20 12.95 0.00 0.00 19033.28 2594.13 19660.80 00:25:29.544 =================================================================================================================== 00:25:29.544 Total : 105806.08 103.33 0.00 0.00 19204.77 2594.13 29928.11' 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 13:53:41.064003] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:29.544 [2024-06-10 13:53:41.064065] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1696788 ] 00:25:29.544 Using job config with 4 jobs 00:25:29.544 [2024-06-10 13:53:41.181626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.544 [2024-06-10 13:53:41.261517] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.544 cpumask for '\''job0'\'' is too big 00:25:29.544 cpumask for '\''job1'\'' is too big 00:25:29.544 cpumask for '\''job2'\'' is too big 00:25:29.544 cpumask for '\''job3'\'' is too big 00:25:29.544 Running I/O for 2 seconds... 00:25:29.544 00:25:29.544 Latency(us) 00:25:29.544 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.03 13241.33 12.93 0.00 0.00 19308.83 3495.25 29928.11 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.03 13230.01 12.92 0.00 0.00 19307.91 4205.23 29928.11 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.03 13218.99 12.91 0.00 0.00 19261.52 3467.95 26432.85 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.04 13207.74 12.90 0.00 0.00 19262.06 4177.92 26432.85 00:25:29.544 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.04 13196.78 12.89 0.00 0.00 19216.86 3467.95 22937.60 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.04 13185.61 12.88 0.00 0.00 19216.89 4177.92 22937.60 00:25:29.544 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc0 : 2.05 13268.42 12.96 0.00 0.00 19034.05 3263.15 19660.80 00:25:29.544 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:25:29.544 Malloc1 : 2.05 13257.20 12.95 0.00 0.00 19033.28 2594.13 19660.80 00:25:29.544 =================================================================================================================== 00:25:29.544 Total : 105806.08 103.33 0.00 0.00 19204.77 2594.13 29928.11' 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:25:29.544 13:53:43 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:25:29.544 00:25:29.544 real 0m10.409s 00:25:29.544 user 0m9.390s 00:25:29.544 sys 0m0.823s 00:25:29.544 13:53:43 bdevperf_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:29.544 13:53:43 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:25:29.544 ************************************ 00:25:29.544 END TEST bdevperf_config 00:25:29.544 ************************************ 00:25:29.544 13:53:43 -- spdk/autotest.sh@192 -- # uname -s 00:25:29.544 13:53:43 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:25:29.544 13:53:43 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:29.544 13:53:43 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:29.544 13:53:43 -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:25:29.544 13:53:43 -- common/autotest_common.sh@10 -- # set +x 00:25:29.544 ************************************ 00:25:29.544 START TEST reactor_set_interrupt 00:25:29.544 ************************************ 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:29.544 * Looking for test storage... 00:25:29.544 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:29.544 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:25:29.544 13:53:43 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:25:29.544 13:53:43 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:25:29.544 13:53:43 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:25:29.544 13:53:43 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:25:29.544 13:53:43 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:25:29.544 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
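The long run of `CONFIG_*` lines echoed above comes from sourcing build_config.sh, which is a flat list of `CONFIG_NAME=value` assignments that callers branch on as plain y/n strings. A hedged sketch of that consumption pattern, using a synthesised file for illustration (the real file is generated by SPDK's configure step):

```shell
#!/usr/bin/env bash
# Sketch: consume a flat CONFIG_*=value file the way the trace shows
# build_config.sh being sourced. The file below is synthesised.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
CONFIG_CRYPTO=y
CONFIG_ASAN=n
EOF
# Sourcing turns each line into a shell variable assignment.
source "$cfg"
# Feature gates are then simple string comparisons on y/n.
if [[ $CONFIG_CRYPTO == y ]]; then
    echo "crypto enabled"
fi
```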
00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:25:29.544 13:53:43 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:25:29.544 #define SPDK_CONFIG_H 00:25:29.544 #define SPDK_CONFIG_APPS 1 00:25:29.544 #define SPDK_CONFIG_ARCH native 00:25:29.544 #undef SPDK_CONFIG_ASAN 00:25:29.544 #undef SPDK_CONFIG_AVAHI 00:25:29.544 #undef SPDK_CONFIG_CET 00:25:29.544 #define SPDK_CONFIG_COVERAGE 1 00:25:29.544 #define SPDK_CONFIG_CROSS_PREFIX 
00:25:29.544 #define SPDK_CONFIG_CRYPTO 1 00:25:29.544 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:25:29.544 #undef SPDK_CONFIG_CUSTOMOCF 00:25:29.544 #undef SPDK_CONFIG_DAOS 00:25:29.544 #define SPDK_CONFIG_DAOS_DIR 00:25:29.544 #define SPDK_CONFIG_DEBUG 1 00:25:29.544 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:25:29.544 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:29.544 #define SPDK_CONFIG_DPDK_INC_DIR 00:25:29.544 #define SPDK_CONFIG_DPDK_LIB_DIR 00:25:29.544 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:25:29.544 #undef SPDK_CONFIG_DPDK_UADK 00:25:29.544 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:29.544 #define SPDK_CONFIG_EXAMPLES 1 00:25:29.544 #undef SPDK_CONFIG_FC 00:25:29.544 #define SPDK_CONFIG_FC_PATH 00:25:29.544 #define SPDK_CONFIG_FIO_PLUGIN 1 00:25:29.544 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:25:29.544 #undef SPDK_CONFIG_FUSE 00:25:29.544 #undef SPDK_CONFIG_FUZZER 00:25:29.544 #define SPDK_CONFIG_FUZZER_LIB 00:25:29.544 #undef SPDK_CONFIG_GOLANG 00:25:29.544 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:25:29.544 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:25:29.544 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:25:29.544 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:25:29.544 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:25:29.544 #undef SPDK_CONFIG_HAVE_LIBBSD 00:25:29.544 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:25:29.544 #define SPDK_CONFIG_IDXD 1 00:25:29.544 #define SPDK_CONFIG_IDXD_KERNEL 1 00:25:29.544 #define SPDK_CONFIG_IPSEC_MB 1 00:25:29.544 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:29.544 #define SPDK_CONFIG_ISAL 1 00:25:29.544 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:25:29.544 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:25:29.544 #define SPDK_CONFIG_LIBDIR 00:25:29.544 #undef SPDK_CONFIG_LTO 00:25:29.544 #define SPDK_CONFIG_MAX_LCORES 00:25:29.544 #define SPDK_CONFIG_NVME_CUSE 1 00:25:29.544 #undef 
SPDK_CONFIG_OCF 00:25:29.544 #define SPDK_CONFIG_OCF_PATH 00:25:29.544 #define SPDK_CONFIG_OPENSSL_PATH 00:25:29.544 #undef SPDK_CONFIG_PGO_CAPTURE 00:25:29.544 #define SPDK_CONFIG_PGO_DIR 00:25:29.544 #undef SPDK_CONFIG_PGO_USE 00:25:29.544 #define SPDK_CONFIG_PREFIX /usr/local 00:25:29.544 #undef SPDK_CONFIG_RAID5F 00:25:29.544 #undef SPDK_CONFIG_RBD 00:25:29.544 #define SPDK_CONFIG_RDMA 1 00:25:29.545 #define SPDK_CONFIG_RDMA_PROV verbs 00:25:29.545 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:25:29.545 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:25:29.545 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:25:29.545 #define SPDK_CONFIG_SHARED 1 00:25:29.545 #undef SPDK_CONFIG_SMA 00:25:29.545 #define SPDK_CONFIG_TESTS 1 00:25:29.545 #undef SPDK_CONFIG_TSAN 00:25:29.545 #define SPDK_CONFIG_UBLK 1 00:25:29.545 #define SPDK_CONFIG_UBSAN 1 00:25:29.545 #undef SPDK_CONFIG_UNIT_TESTS 00:25:29.545 #undef SPDK_CONFIG_URING 00:25:29.545 #define SPDK_CONFIG_URING_PATH 00:25:29.545 #undef SPDK_CONFIG_URING_ZNS 00:25:29.545 #undef SPDK_CONFIG_USDT 00:25:29.545 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:25:29.545 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:25:29.545 #undef SPDK_CONFIG_VFIO_USER 00:25:29.545 #define SPDK_CONFIG_VFIO_USER_DIR 00:25:29.545 #define SPDK_CONFIG_VHOST 1 00:25:29.545 #define SPDK_CONFIG_VIRTIO 1 00:25:29.545 #undef SPDK_CONFIG_VTUNE 00:25:29.545 #define SPDK_CONFIG_VTUNE_DIR 00:25:29.545 #define SPDK_CONFIG_WERROR 1 00:25:29.545 #define SPDK_CONFIG_WPDK_DIR 00:25:29.545 #undef SPDK_CONFIG_XNVME 00:25:29.545 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:29.545 13:53:43 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:25:29.545 13:53:43 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:25:29.545 13:53:43 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:25:29.545 13:53:43 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:25:29.545 13:53:43 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:25:29.545 13:53:43 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:25:29.545 13:53:43 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:25:29.545 
13:53:43 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:29.545 13:53:43 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:29.545 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:29.546 13:53:43 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j144 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1697408 ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1697408 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.ut36ML 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.ut36ML/tests/interrupt /tmp/spdk.ut36ML 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=123469434880 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=134655774720 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=11186339840 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67323174912 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67327885312 00:25:29.546 13:53:43 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=26921172992 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=26931154944 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9981952 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=96256 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=407552 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67327320064 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67327889408 00:25:29.546 13:53:43 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=569344 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=13465571328 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=13465575424 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:25:29.546 * Looking for test storage... 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=123469434880 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:25:29.546 13:53:43 
reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=13400932352 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.546 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # set -o errtrace 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # true 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@1688 -- # xtrace_fd 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:25:29.546 13:53:43 
reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:29.546 13:53:43 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1697451 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1697451 /var/tmp/spdk.sock 00:25:29.546 13:53:43 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 1697451 ']' 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:29.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:29.546 13:53:43 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:25:29.546 [2024-06-10 13:53:43.990448] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:29.546 [2024-06-10 13:53:43.990511] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1697451 ] 00:25:29.806 [2024-06-10 13:53:44.084929] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:29.806 [2024-06-10 13:53:44.150712] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:25:29.806 [2024-06-10 13:53:44.150864] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:25:29.806 [2024-06-10 13:53:44.150867] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.806 [2024-06-10 13:53:44.202667] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:25:30.376 13:53:44 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:30.376 13:53:44 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:25:30.376 13:53:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:25:30.376 13:53:44 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:30.636 Malloc0 00:25:30.636 Malloc1 00:25:30.636 Malloc2 00:25:30.636 13:53:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:25:30.636 13:53:44 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:25:30.636 13:53:44 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:25:30.636 13:53:44 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:25:30.636 5000+0 records in 00:25:30.636 5000+0 records out 00:25:30.636 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0182752 s, 560 MB/s 00:25:30.636 13:53:44 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:25:30.896 AIO0 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1697451 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1697451 without_thd 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1697451 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:30.896 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:25:31.157 spdk_thread ids are 1 on reactor0. 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1697451 0 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1697451 0 idle 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:31.157 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697451 root 20 0 128.2g 35712 23040 S 6.7 0.0 0:00.30 reactor_0' 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697451 root 20 0 128.2g 35712 23040 S 6.7 0.0 0:00.30 reactor_0 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:31.416 13:53:45 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1697451 1 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1697451 1 idle 00:25:31.416 13:53:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256 00:25:31.417 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:25:31.676 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697455 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1' 
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697455 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1697451 2
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1697451 2 idle
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256
00:25:31.677 13:53:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697456 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2'
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697456 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']'
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}"
00:25:31.677 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2
00:25:31.937 [2024-06-10 13:53:46.303807] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:25:31.937 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:25:32.197 [2024-06-10 13:53:46.515358] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:25:32.197 [2024-06-10 13:53:46.515916] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:25:32.197 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
00:25:32.458 [2024-06-10 13:53:46.683328] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2.
00:25:32.458 [2024-06-10 13:53:46.683767] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1697451 0
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1697451 0 busy
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697451 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.67 reactor_0'
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697451 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.67 reactor_0
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1697451 2
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1697451 2 busy
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256
00:25:32.458 13:53:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697456 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.36 reactor_2'
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697456 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.36 reactor_2
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:32.718 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2
00:25:32.978 [2024-06-10 13:53:47.255324] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2.
00:25:32.978 [2024-06-10 13:53:47.255402] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']'
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1697451 2
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1697451 2 idle
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697456 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.57 reactor_2'
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697456 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.57 reactor_2
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:32.978 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0
00:25:33.238 [2024-06-10 13:53:47.635329] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0.
00:25:33.238 [2024-06-10 13:53:47.635660] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:25:33.238 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']'
00:25:33.238 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}"
00:25:33.238 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1
00:25:33.498 [2024-06-10 13:53:47.843730] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1697451 0
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1697451 0 idle
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1697451
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1697451 -w 256
00:25:33.498 13:53:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1697451 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.43 reactor_0'
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1697451 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.43 reactor_0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT
00:25:33.757 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1697451
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 1697451 ']'
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 1697451
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1697451
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1697451'
killing process with pid 1697451
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 1697451
00:25:33.757 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 1697451
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1698410
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1698410 /var/tmp/spdk.sock
00:25:34.017 13:53:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 1698410 ']'
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable
00:25:34.017 13:53:48 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:25:34.017 [2024-06-10 13:53:48.278701] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:25:34.017 [2024-06-10 13:53:48.278750] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698410 ]
00:25:34.017 [2024-06-10 13:53:48.365824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:25:34.017 [2024-06-10 13:53:48.431269] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1
00:25:34.017 [2024-06-10 13:53:48.431426] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2
00:25:34.017 [2024-06-10 13:53:48.431432] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:25:34.017 [2024-06-10 13:53:48.482970] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:25:34.587 13:53:49 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:25:34.587 13:53:49 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0
00:25:34.587 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem
00:25:34.587 13:53:49 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:34.847 Malloc0
00:25:34.847 Malloc1
00:25:34.847 Malloc2
00:25:34.847 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio
00:25:34.847 13:53:49 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s
00:25:34.847 13:53:49 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]]
00:25:34.847 13:53:49 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000
00:25:34.847 5000+0 records in
00:25:34.847 5000+0 records out
00:25:34.847 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0158312 s, 647 MB/s
00:25:34.847 13:53:49 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:25:35.107 AIO0
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1698410
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1698410
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1698410
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask))
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:25:35.107 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
spdk_thread ids are 1 on reactor0.
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1698410 0
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1698410 0 idle
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256
00:25:35.368 13:53:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698410 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.29 reactor_0'
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698410 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.29 reactor_0
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1698410 1
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1698410 1 idle
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256
00:25:35.630 13:53:49 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698452 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698452 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1698410 2
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1698410 2 idle
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698453 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698453 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']'
00:25:35.891 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:25:36.152 [2024-06-10 13:53:50.511908] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:25:36.152 [2024-06-10 13:53:50.512082] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode.
00:25:36.152 [2024-06-10 13:53:50.512289] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:25:36.152 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
00:25:36.441 [2024-06-10 13:53:50.668147] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2.
00:25:36.441 [2024-06-10 13:53:50.668436] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1698410 0 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1698410 0 busy 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698410 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.63 reactor_0' 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698410 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.63 reactor_0 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:25:36.441 13:53:50 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1698410 2 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1698410 2 busy 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256 00:25:36.441 13:53:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698453 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.35 reactor_2' 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698453 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.35 reactor_2 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:36.794 13:53:51 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:25:36.794 [2024-06-10 13:53:51.225604] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:25:36.794 [2024-06-10 13:53:51.225755] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1698410 2 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1698410 2 idle 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:25:36.794 13:53:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698453 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.55 reactor_2' 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698453 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.55 reactor_2 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:37.054 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:25:37.314 [2024-06-10 13:53:51.606550] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:25:37.314 [2024-06-10 13:53:51.606848] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:25:37.314 [2024-06-10 13:53:51.606861] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1698410 0 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1698410 0 idle 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1698410 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1698410 -w 256 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1698410 root 20 0 128.2g 35712 23040 S 6.7 0.0 0:01.39 reactor_0' 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1698410 root 20 0 128.2g 35712 23040 S 6.7 0.0 0:01.39 reactor_0 00:25:37.314 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:25:37.315 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:25:37.575 13:53:51 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:25:37.575 13:53:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1698410 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 1698410 ']' 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 1698410 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1698410 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1698410' 00:25:37.575 killing process with pid 1698410 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 1698410 00:25:37.575 13:53:51 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 1698410 00:25:37.575 13:53:52 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:25:37.575 13:53:52 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:25:37.575 00:25:37.575 real 0m8.354s 00:25:37.575 user 0m7.736s 00:25:37.575 sys 0m1.548s 00:25:37.575 13:53:52 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:37.575 13:53:52 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:25:37.575 ************************************ 00:25:37.575 END TEST reactor_set_interrupt 00:25:37.575 ************************************ 00:25:37.837 13:53:52 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:25:37.837 13:53:52 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:25:37.837 13:53:52 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:37.837 13:53:52 -- common/autotest_common.sh@10 -- # set +x 00:25:37.837 ************************************ 00:25:37.837 START TEST reap_unregistered_poller 00:25:37.837 ************************************ 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:25:37.837 * Looking for test storage... 
00:25:37.837 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:37.837 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:25:37.837 13:53:52 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:25:37.837 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:25:37.837 13:53:52 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:25:37.837 13:53:52 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:25:37.838 
13:53:52 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:25:37.838 13:53:52 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:25:37.838 13:53:52 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:25:37.838 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:25:37.838 #define SPDK_CONFIG_H 00:25:37.838 #define SPDK_CONFIG_APPS 1 00:25:37.838 #define SPDK_CONFIG_ARCH native 00:25:37.838 #undef SPDK_CONFIG_ASAN 00:25:37.838 #undef SPDK_CONFIG_AVAHI 00:25:37.838 #undef SPDK_CONFIG_CET 00:25:37.838 #define SPDK_CONFIG_COVERAGE 1 00:25:37.838 #define SPDK_CONFIG_CROSS_PREFIX 00:25:37.838 #define SPDK_CONFIG_CRYPTO 1 00:25:37.838 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:25:37.838 #undef SPDK_CONFIG_CUSTOMOCF 00:25:37.838 #undef SPDK_CONFIG_DAOS 00:25:37.838 #define SPDK_CONFIG_DAOS_DIR 00:25:37.838 #define SPDK_CONFIG_DEBUG 1 00:25:37.838 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:25:37.838 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:25:37.838 #define SPDK_CONFIG_DPDK_INC_DIR 00:25:37.838 #define SPDK_CONFIG_DPDK_LIB_DIR 00:25:37.838 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:25:37.838 #undef SPDK_CONFIG_DPDK_UADK 00:25:37.838 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:25:37.838 #define SPDK_CONFIG_EXAMPLES 1 00:25:37.838 #undef SPDK_CONFIG_FC 00:25:37.838 #define SPDK_CONFIG_FC_PATH 00:25:37.838 #define SPDK_CONFIG_FIO_PLUGIN 1 00:25:37.838 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:25:37.838 #undef SPDK_CONFIG_FUSE 00:25:37.838 #undef SPDK_CONFIG_FUZZER 00:25:37.838 #define SPDK_CONFIG_FUZZER_LIB 00:25:37.838 #undef SPDK_CONFIG_GOLANG 00:25:37.838 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:25:37.838 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:25:37.838 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:25:37.838 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:25:37.838 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:25:37.838 #undef SPDK_CONFIG_HAVE_LIBBSD 00:25:37.838 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:25:37.838 #define SPDK_CONFIG_IDXD 1 00:25:37.838 #define SPDK_CONFIG_IDXD_KERNEL 1 00:25:37.838 #define SPDK_CONFIG_IPSEC_MB 1 00:25:37.838 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:25:37.838 #define SPDK_CONFIG_ISAL 1 00:25:37.838 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:25:37.838 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:25:37.838 #define SPDK_CONFIG_LIBDIR 00:25:37.838 #undef SPDK_CONFIG_LTO 00:25:37.838 #define SPDK_CONFIG_MAX_LCORES 00:25:37.838 #define SPDK_CONFIG_NVME_CUSE 1 00:25:37.838 #undef SPDK_CONFIG_OCF 00:25:37.838 #define SPDK_CONFIG_OCF_PATH 00:25:37.838 #define SPDK_CONFIG_OPENSSL_PATH 00:25:37.838 #undef SPDK_CONFIG_PGO_CAPTURE 00:25:37.838 #define SPDK_CONFIG_PGO_DIR 00:25:37.838 #undef SPDK_CONFIG_PGO_USE 00:25:37.838 #define SPDK_CONFIG_PREFIX /usr/local 00:25:37.838 #undef SPDK_CONFIG_RAID5F 00:25:37.838 #undef SPDK_CONFIG_RBD 00:25:37.838 #define SPDK_CONFIG_RDMA 1 00:25:37.838 #define SPDK_CONFIG_RDMA_PROV verbs 00:25:37.838 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:25:37.838 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:25:37.838 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:25:37.838 #define 
SPDK_CONFIG_SHARED 1 00:25:37.838 #undef SPDK_CONFIG_SMA 00:25:37.838 #define SPDK_CONFIG_TESTS 1 00:25:37.838 #undef SPDK_CONFIG_TSAN 00:25:37.838 #define SPDK_CONFIG_UBLK 1 00:25:37.838 #define SPDK_CONFIG_UBSAN 1 00:25:37.838 #undef SPDK_CONFIG_UNIT_TESTS 00:25:37.838 #undef SPDK_CONFIG_URING 00:25:37.838 #define SPDK_CONFIG_URING_PATH 00:25:37.838 #undef SPDK_CONFIG_URING_ZNS 00:25:37.838 #undef SPDK_CONFIG_USDT 00:25:37.838 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:25:37.838 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:25:37.838 #undef SPDK_CONFIG_VFIO_USER 00:25:37.838 #define SPDK_CONFIG_VFIO_USER_DIR 00:25:37.838 #define SPDK_CONFIG_VHOST 1 00:25:37.838 #define SPDK_CONFIG_VIRTIO 1 00:25:37.838 #undef SPDK_CONFIG_VTUNE 00:25:37.838 #define SPDK_CONFIG_VTUNE_DIR 00:25:37.838 #define SPDK_CONFIG_WERROR 1 00:25:37.838 #define SPDK_CONFIG_WPDK_DIR 00:25:37.838 #undef SPDK_CONFIG_XNVME 00:25:37.838 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:25:37.838 13:53:52 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:25:37.838 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:37.838 13:53:52 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:25:37.838 13:53:52 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:25:37.838 13:53:52 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:25:37.838 13:53:52 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:37.838 13:53:52 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:37.838 13:53:52 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:37.838 13:53:52 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:25:37.839 13:53:52 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:25:37.839 13:53:52 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:25:37.839 13:53:52 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:25:37.839 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:25:37.839 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:25:37.839 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:25:37.839 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:25:37.839 13:53:52 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:25:37.839 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:25:37.840 13:53:52 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:25:37.840 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j144 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1699277 ]] 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1699277 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.VEnD48 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.VEnD48/tests/interrupt /tmp/spdk.VEnD48 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:25:37.840 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:25:37.841 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=123469250560 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=134655774720 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=11186524160 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67323174912 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67327885312 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 
00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=26921177088 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=26931154944 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9977856 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=96256 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=407552 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67327320064 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67327889408 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=569344 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=13465571328 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=13465575424 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:25:37.841 * Looking for test storage... 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=123469250560 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:25:37.841 13:53:52 reap_unregistered_poller -- 
common/autotest_common.sh@381 -- # new_size=13401116672 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.841 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # set -o errtrace 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # true 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@1688 -- # xtrace_fd 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:25:37.841 13:53:52 
reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:25:37.841 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1699321 00:25:37.841 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:37.842 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:25:37.842 13:53:52 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1699321 /var/tmp/spdk.sock 00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@830 -- # '[' -z 1699321 ']' 00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:37.842 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:37.842 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:25:38.102 [2024-06-10 13:53:52.310703] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:25:38.102 [2024-06-10 13:53:52.310744] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1699321 ] 00:25:38.102 [2024-06-10 13:53:52.388411] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:38.102 [2024-06-10 13:53:52.454719] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:25:38.102 [2024-06-10 13:53:52.454873] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:25:38.102 [2024-06-10 13:53:52.454876] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:25:38.102 [2024-06-10 13:53:52.506730] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:25:38.102 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:25:38.102 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@863 -- # return 0
00:25:38.102 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers
00:25:38.102 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable
00:25:38.102 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]'
00:25:38.102 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x
00:25:38.102 13:53:52 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:25:38.102 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{
00:25:38.102 "name": "app_thread",
00:25:38.102 "id": 1,
00:25:38.102 "active_pollers": [],
00:25:38.102 "timed_pollers": [
00:25:38.102 {
00:25:38.102 "name": "rpc_subsystem_poll_servers",
00:25:38.102 "id": 1,
00:25:38.102 "state": "waiting",
00:25:38.102 "run_count": 0,
00:25:38.102 "busy_count": 0,
00:25:38.102 "period_ticks": 9600000
00:25:38.102 }
00:25:38.102 ],
00:25:38.102 "paused_pollers": []
00:25:38.102 }'
00:25:38.102 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name'
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers=
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' '
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name'
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:25:38.363
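The test captures `rpc_cmd thread_get_pollers`, selects `.threads[0]` with jq, then extracts `.active_pollers[].name` and `.timed_pollers[].name` into `native_pollers`. The same extraction in Python, run on the JSON shape shown above (field names and values copied from the log):

```python
import json

# thread_get_pollers snapshot for app_thread, as dumped in the log above.
app_thread = json.loads("""{
  "name": "app_thread",
  "id": 1,
  "active_pollers": [],
  "timed_pollers": [
    {
      "name": "rpc_subsystem_poll_servers",
      "id": 1,
      "state": "waiting",
      "run_count": 0,
      "busy_count": 0,
      "period_ticks": 9600000
    }
  ],
  "paused_pollers": []
}""")

# Mirrors jq -r '.active_pollers[].name' and '.timed_pollers[].name'.
native_pollers = [p["name"] for p in app_thread["active_pollers"]]
timed_pollers = [p["name"] for p in app_thread["timed_pollers"]]
print(native_pollers, timed_pollers)  # → [] ['rpc_subsystem_poll_servers']
```

The later `[[ ... == ... ]]` check in the script passes only if this poller set is unchanged after the AIO bdev is examined.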
13:53:52 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]]
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000
00:25:38.363 5000+0 records in
00:25:38.363 5000+0 records out
00:25:38.363 10240000 bytes (10 MB, 9.8 MiB) copied, 0.00534405 s, 1.9 GB/s
00:25:38.363 13:53:52 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:25:38.624 AIO0
00:25:38.624 13:53:52 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine
00:25:38.624 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]'
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{
00:25:38.885 "name": "app_thread",
00:25:38.885 "id": 1,
00:25:38.885 "active_pollers": [],
00:25:38.885 "timed_pollers": [
00:25:38.885 {
00:25:38.885 "name": "rpc_subsystem_poll_servers",
00:25:38.885 "id": 1,
00:25:38.885 "state": "waiting",
00:25:38.885 "run_count": 0,
00:25:38.885 "busy_count": 0,
00:25:38.885 "period_ticks": 9600000
00:25:38.885 }
00:25:38.885 ],
00:25:38.885 "paused_pollers": []
00:25:38.885 }'
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name'
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers=
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' '
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name'
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]]
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT
00:25:38.885 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1699321
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@949 -- # '[' -z 1699321 ']'
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@953 -- # kill -0 1699321
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@954 -- # uname
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1699321
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@967 -- #
echo 'killing process with pid 1699321' 00:25:38.885 killing process with pid 1699321 00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@968 -- # kill 1699321 00:25:38.885 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@973 -- # wait 1699321 00:25:39.146 13:53:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:25:39.146 13:53:53 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:25:39.146 00:25:39.146 real 0m1.338s 00:25:39.146 user 0m0.985s 00:25:39.146 sys 0m0.425s 00:25:39.146 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:39.146 13:53:53 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:25:39.146 ************************************ 00:25:39.146 END TEST reap_unregistered_poller 00:25:39.146 ************************************ 00:25:39.146 13:53:53 -- spdk/autotest.sh@198 -- # uname -s 00:25:39.146 13:53:53 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:25:39.146 13:53:53 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:25:39.146 13:53:53 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:25:39.146 13:53:53 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@260 -- # timing_exit lib 00:25:39.146 13:53:53 -- common/autotest_common.sh@729 -- # xtrace_disable 00:25:39.146 13:53:53 -- common/autotest_common.sh@10 -- # set +x 00:25:39.146 13:53:53 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 
-- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:25:39.146 13:53:53 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:25:39.146 13:53:53 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:25:39.146 13:53:53 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:39.146 13:53:53 -- common/autotest_common.sh@10 -- # set +x 00:25:39.146 ************************************ 00:25:39.146 START TEST compress_compdev 00:25:39.146 ************************************ 00:25:39.146 13:53:53 compress_compdev -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:25:39.407 * Looking for test storage... 
00:25:39.407 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:25:39.407 13:53:53 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:25:39.407 13:53:53 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:39.407 13:53:53 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:25:39.407 13:53:53 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:25:39.407 13:53:53 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:25:39.407 13:53:53 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:39.407 13:53:53 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:39.407 13:53:53 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:39.408 13:53:53 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:25:39.408 13:53:53 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:25:39.408 13:53:53 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1699736 00:25:39.408 13:53:53 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1699736 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1699736 ']' 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:39.408 13:53:53 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:39.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:39.408 13:53:53 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:39.408 [2024-06-10 13:53:53.757032] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:39.408 [2024-06-10 13:53:53.757092] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1699736 ] 00:25:39.408 [2024-06-10 13:53:53.833902] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:39.669 [2024-06-10 13:53:53.906956] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:25:39.669 [2024-06-10 13:53:53.906962] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:25:39.929 [2024-06-10 13:53:54.320995] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:40.190 13:53:54 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:40.190 13:53:54 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:25:40.190 13:53:54 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:25:40.190 13:53:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:40.190 13:53:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:40.761 [2024-06-10 13:53:55.140513] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d52130 PMD being used: compress_qat 00:25:40.761 13:53:55 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:40.761 13:53:55 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:25:40.761 13:53:55 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:40.761 13:53:55 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:40.761 13:53:55 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:40.761 13:53:55 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:40.762 13:53:55 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:41.021 13:53:55 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:41.282 [ 00:25:41.282 { 00:25:41.282 "name": "Nvme0n1", 00:25:41.282 "aliases": [ 00:25:41.282 "36344730-5260-5504-0025-3845000000bb" 00:25:41.282 ], 00:25:41.282 "product_name": "NVMe disk", 00:25:41.282 "block_size": 512, 00:25:41.282 "num_blocks": 3750748848, 00:25:41.282 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:25:41.282 "assigned_rate_limits": { 00:25:41.282 "rw_ios_per_sec": 0, 00:25:41.282 "rw_mbytes_per_sec": 0, 00:25:41.282 "r_mbytes_per_sec": 0, 00:25:41.282 "w_mbytes_per_sec": 0 00:25:41.282 }, 00:25:41.282 "claimed": false, 00:25:41.282 "zoned": false, 00:25:41.282 "supported_io_types": { 00:25:41.282 "read": true, 00:25:41.282 "write": true, 00:25:41.282 "unmap": true, 00:25:41.282 "write_zeroes": true, 00:25:41.282 "flush": true, 00:25:41.282 "reset": true, 00:25:41.282 "compare": true, 00:25:41.282 "compare_and_write": false, 00:25:41.282 "abort": true, 00:25:41.282 "nvme_admin": true, 00:25:41.282 "nvme_io": true 00:25:41.282 }, 00:25:41.282 "driver_specific": { 00:25:41.282 "nvme": [ 00:25:41.282 { 00:25:41.282 "pci_address": "0000:65:00.0", 00:25:41.282 "trid": { 00:25:41.282 "trtype": "PCIe", 00:25:41.282 "traddr": "0000:65:00.0" 00:25:41.282 }, 00:25:41.282 "ctrlr_data": { 00:25:41.282 "cntlid": 6, 00:25:41.282 "vendor_id": "0x144d", 00:25:41.282 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:25:41.282 "serial_number": "S64GNE0R605504", 00:25:41.282 "firmware_revision": "GDC5302Q", 00:25:41.282 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:25:41.282 "oacs": { 00:25:41.282 "security": 1, 00:25:41.282 "format": 1, 00:25:41.282 "firmware": 1, 00:25:41.282 "ns_manage": 1 00:25:41.282 }, 
00:25:41.282 "multi_ctrlr": false, 00:25:41.282 "ana_reporting": false 00:25:41.282 }, 00:25:41.282 "vs": { 00:25:41.282 "nvme_version": "1.4" 00:25:41.282 }, 00:25:41.282 "ns_data": { 00:25:41.282 "id": 1, 00:25:41.282 "can_share": false 00:25:41.282 }, 00:25:41.282 "security": { 00:25:41.282 "opal": true 00:25:41.282 } 00:25:41.282 } 00:25:41.282 ], 00:25:41.282 "mp_policy": "active_passive" 00:25:41.282 } 00:25:41.282 } 00:25:41.282 ] 00:25:41.282 13:53:55 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:25:41.282 13:53:55 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:41.282 [2024-06-10 13:53:55.739852] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ba0290 PMD being used: compress_qat 00:25:42.229 04d096e8-da9a-4376-abf2-c6b8fbbc693a 00:25:42.229 13:53:56 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:42.229 b20cc108-ed31-4086-9705-7eaade5de2bd 00:25:42.229 13:53:56 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:42.229 13:53:56 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:42.489 13:53:56 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:42.749 [ 00:25:42.749 { 00:25:42.749 "name": "b20cc108-ed31-4086-9705-7eaade5de2bd", 00:25:42.749 "aliases": [ 00:25:42.749 "lvs0/lv0" 00:25:42.749 ], 00:25:42.749 "product_name": "Logical Volume", 00:25:42.749 "block_size": 512, 00:25:42.749 "num_blocks": 204800, 00:25:42.749 "uuid": "b20cc108-ed31-4086-9705-7eaade5de2bd", 00:25:42.749 "assigned_rate_limits": { 00:25:42.749 "rw_ios_per_sec": 0, 00:25:42.749 "rw_mbytes_per_sec": 0, 00:25:42.749 "r_mbytes_per_sec": 0, 00:25:42.749 "w_mbytes_per_sec": 0 00:25:42.749 }, 00:25:42.749 "claimed": false, 00:25:42.749 "zoned": false, 00:25:42.749 "supported_io_types": { 00:25:42.749 "read": true, 00:25:42.749 "write": true, 00:25:42.749 "unmap": true, 00:25:42.749 "write_zeroes": true, 00:25:42.749 "flush": false, 00:25:42.749 "reset": true, 00:25:42.749 "compare": false, 00:25:42.749 "compare_and_write": false, 00:25:42.749 "abort": false, 00:25:42.749 "nvme_admin": false, 00:25:42.749 "nvme_io": false 00:25:42.749 }, 00:25:42.749 "driver_specific": { 00:25:42.749 "lvol": { 00:25:42.749 "lvol_store_uuid": "04d096e8-da9a-4376-abf2-c6b8fbbc693a", 00:25:42.749 "base_bdev": "Nvme0n1", 00:25:42.749 "thin_provision": true, 00:25:42.749 "num_allocated_clusters": 0, 00:25:42.749 "snapshot": false, 00:25:42.749 "clone": false, 00:25:42.749 "esnap_clone": false 00:25:42.749 } 00:25:42.749 } 00:25:42.749 } 00:25:42.749 ] 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:25:42.749 13:53:57 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:25:42.749 13:53:57 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:25:42.749 [2024-06-10 13:53:57.201386] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:42.749 COMP_lvs0/lv0 00:25:42.749 13:53:57 compress_compdev -- 
compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:42.749 13:53:57 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:43.010 13:53:57 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:43.270 [ 00:25:43.271 { 00:25:43.271 "name": "COMP_lvs0/lv0", 00:25:43.271 "aliases": [ 00:25:43.271 "0eb02bf8-035f-5f91-9048-10915cba24bb" 00:25:43.271 ], 00:25:43.271 "product_name": "compress", 00:25:43.271 "block_size": 512, 00:25:43.271 "num_blocks": 200704, 00:25:43.271 "uuid": "0eb02bf8-035f-5f91-9048-10915cba24bb", 00:25:43.271 "assigned_rate_limits": { 00:25:43.271 "rw_ios_per_sec": 0, 00:25:43.271 "rw_mbytes_per_sec": 0, 00:25:43.271 "r_mbytes_per_sec": 0, 00:25:43.271 "w_mbytes_per_sec": 0 00:25:43.271 }, 00:25:43.271 "claimed": false, 00:25:43.271 "zoned": false, 00:25:43.271 "supported_io_types": { 00:25:43.271 "read": true, 00:25:43.271 "write": true, 00:25:43.271 "unmap": false, 00:25:43.271 "write_zeroes": true, 00:25:43.271 "flush": false, 00:25:43.271 "reset": false, 00:25:43.271 "compare": false, 00:25:43.271 "compare_and_write": false, 00:25:43.271 "abort": false, 00:25:43.271 "nvme_admin": false, 00:25:43.271 "nvme_io": false 00:25:43.271 }, 00:25:43.271 "driver_specific": { 00:25:43.271 "compress": { 00:25:43.271 "name": "COMP_lvs0/lv0", 00:25:43.271 "base_bdev_name": 
"b20cc108-ed31-4086-9705-7eaade5de2bd"
00:25:43.271 }
00:25:43.271 }
00:25:43.271 }
00:25:43.271 ]
00:25:43.271 13:53:57 compress_compdev -- common/autotest_common.sh@906 -- # return 0
00:25:43.271 13:53:57 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:25:43.271 [2024-06-10 13:53:57.714520] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f56901b1440 PMD being used: compress_qat
00:25:43.271 [2024-06-10 13:53:57.716196] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b886c0 PMD being used: compress_qat
00:25:43.271 Running I/O for 3 seconds...
00:25:46.564
00:25:46.564 Latency(us)
00:25:46.564 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:25:46.564 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:25:46.564 Verification LBA range: start 0x0 length 0x3100
00:25:46.564 COMP_lvs0/lv0 : 3.00 5932.58 23.17 0.00 0.00 5348.16 436.91 5625.17
00:25:46.564 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:25:46.564 Verification LBA range: start 0x3100 length 0x3100
00:25:46.564 COMP_lvs0/lv0 : 3.00 6104.67 23.85 0.00 0.00 5206.45 392.53 5570.56
00:25:46.564 ===================================================================================================================
00:25:46.564 Total : 12037.25 47.02 0.00 0.00 5276.29 392.53 5625.17
00:25:46.564 0
00:25:46.564 13:54:00 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:25:46.564 13:54:00 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:25:46.564 13:54:00 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:25:46.825 13:54:01 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM
EXIT 00:25:46.825 13:54:01 compress_compdev -- compress/compress.sh@78 -- # killprocess 1699736 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1699736 ']' 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1699736 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1699736 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1699736' 00:25:46.825 killing process with pid 1699736 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@968 -- # kill 1699736 00:25:46.825 Received shutdown signal, test time was about 3.000000 seconds 00:25:46.825 00:25:46.825 Latency(us) 00:25:46.825 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:46.825 =================================================================================================================== 00:25:46.825 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:46.825 13:54:01 compress_compdev -- common/autotest_common.sh@973 -- # wait 1699736 00:25:48.734 13:54:03 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:25:48.734 13:54:03 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:48.734 13:54:03 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1701532 00:25:48.734 13:54:03 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:48.734 13:54:03 compress_compdev -- 
compress/compress.sh@73 -- # waitforlisten 1701532 00:25:48.734 13:54:03 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1701532 ']' 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:48.734 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:48.734 13:54:03 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:48.734 [2024-06-10 13:54:03.205174] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:48.734 [2024-06-10 13:54:03.205228] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1701532 ] 00:25:48.993 [2024-06-10 13:54:03.275271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:48.993 [2024-06-10 13:54:03.341634] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:25:48.993 [2024-06-10 13:54:03.341639] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:25:49.563 [2024-06-10 13:54:03.754278] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:49.822 13:54:04 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:49.822 13:54:04 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:25:49.823 13:54:04 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:25:49.823 13:54:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:49.823 13:54:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:50.392 [2024-06-10 13:54:04.564466] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2333130 PMD being used: compress_qat 00:25:50.392 13:54:04 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:50.392 13:54:04 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:50.392 13:54:04 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:50.652 [ 00:25:50.652 { 00:25:50.652 "name": "Nvme0n1", 00:25:50.652 "aliases": [ 00:25:50.652 "36344730-5260-5504-0025-3845000000bb" 00:25:50.652 ], 00:25:50.652 "product_name": "NVMe disk", 00:25:50.652 "block_size": 512, 00:25:50.652 "num_blocks": 3750748848, 00:25:50.652 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:25:50.652 "assigned_rate_limits": { 00:25:50.652 "rw_ios_per_sec": 0, 00:25:50.652 "rw_mbytes_per_sec": 0, 00:25:50.652 "r_mbytes_per_sec": 0, 00:25:50.652 "w_mbytes_per_sec": 0 00:25:50.652 }, 00:25:50.652 "claimed": false, 00:25:50.652 "zoned": false, 00:25:50.652 "supported_io_types": { 00:25:50.652 "read": true, 00:25:50.652 "write": true, 00:25:50.652 "unmap": true, 00:25:50.652 "write_zeroes": true, 00:25:50.652 "flush": true, 00:25:50.652 "reset": true, 00:25:50.652 "compare": true, 00:25:50.652 "compare_and_write": false, 00:25:50.652 "abort": true, 00:25:50.652 "nvme_admin": true, 00:25:50.652 "nvme_io": true 00:25:50.652 }, 00:25:50.652 "driver_specific": { 00:25:50.652 "nvme": [ 00:25:50.652 { 00:25:50.652 "pci_address": "0000:65:00.0", 00:25:50.652 "trid": { 00:25:50.652 "trtype": "PCIe", 00:25:50.652 "traddr": "0000:65:00.0" 00:25:50.652 }, 00:25:50.652 "ctrlr_data": { 00:25:50.652 "cntlid": 6, 00:25:50.652 "vendor_id": "0x144d", 00:25:50.652 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:25:50.652 "serial_number": "S64GNE0R605504", 00:25:50.652 "firmware_revision": "GDC5302Q", 00:25:50.652 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:25:50.652 "oacs": { 00:25:50.652 "security": 1, 00:25:50.652 "format": 1, 00:25:50.652 "firmware": 1, 00:25:50.652 "ns_manage": 1 00:25:50.652 }, 
00:25:50.652 "multi_ctrlr": false, 00:25:50.652 "ana_reporting": false 00:25:50.652 }, 00:25:50.652 "vs": { 00:25:50.652 "nvme_version": "1.4" 00:25:50.652 }, 00:25:50.652 "ns_data": { 00:25:50.652 "id": 1, 00:25:50.652 "can_share": false 00:25:50.652 }, 00:25:50.652 "security": { 00:25:50.652 "opal": true 00:25:50.652 } 00:25:50.652 } 00:25:50.652 ], 00:25:50.652 "mp_policy": "active_passive" 00:25:50.652 } 00:25:50.652 } 00:25:50.652 ] 00:25:50.652 13:54:05 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:25:50.652 13:54:05 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:50.912 [2024-06-10 13:54:05.223989] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2181290 PMD being used: compress_qat 00:25:51.482 75961b74-5d94-4c44-b1a5-a102f897c921 00:25:51.482 13:54:05 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:51.741 6f3775e0-758e-4ce9-bb05-e437e4109132 00:25:51.741 13:54:06 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:51.741 13:54:06 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:52.001 13:54:06 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:52.001 [ 00:25:52.001 { 00:25:52.001 "name": "6f3775e0-758e-4ce9-bb05-e437e4109132", 00:25:52.001 "aliases": [ 00:25:52.001 "lvs0/lv0" 00:25:52.001 ], 00:25:52.001 "product_name": "Logical Volume", 00:25:52.001 "block_size": 512, 00:25:52.001 "num_blocks": 204800, 00:25:52.001 "uuid": "6f3775e0-758e-4ce9-bb05-e437e4109132", 00:25:52.001 "assigned_rate_limits": { 00:25:52.001 "rw_ios_per_sec": 0, 00:25:52.001 "rw_mbytes_per_sec": 0, 00:25:52.001 "r_mbytes_per_sec": 0, 00:25:52.001 "w_mbytes_per_sec": 0 00:25:52.001 }, 00:25:52.001 "claimed": false, 00:25:52.001 "zoned": false, 00:25:52.001 "supported_io_types": { 00:25:52.001 "read": true, 00:25:52.001 "write": true, 00:25:52.001 "unmap": true, 00:25:52.001 "write_zeroes": true, 00:25:52.001 "flush": false, 00:25:52.001 "reset": true, 00:25:52.001 "compare": false, 00:25:52.001 "compare_and_write": false, 00:25:52.001 "abort": false, 00:25:52.001 "nvme_admin": false, 00:25:52.001 "nvme_io": false 00:25:52.001 }, 00:25:52.001 "driver_specific": { 00:25:52.001 "lvol": { 00:25:52.001 "lvol_store_uuid": "75961b74-5d94-4c44-b1a5-a102f897c921", 00:25:52.001 "base_bdev": "Nvme0n1", 00:25:52.001 "thin_provision": true, 00:25:52.001 "num_allocated_clusters": 0, 00:25:52.001 "snapshot": false, 00:25:52.001 "clone": false, 00:25:52.001 "esnap_clone": false 00:25:52.001 } 00:25:52.001 } 00:25:52.001 } 00:25:52.001 ] 00:25:52.001 13:54:06 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:25:52.001 13:54:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:25:52.001 13:54:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:25:52.261 [2024-06-10 13:54:06.649376] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:52.261 COMP_lvs0/lv0 00:25:52.261 13:54:06 compress_compdev -- 
compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:52.261 13:54:06 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:52.521 13:54:06 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:52.781 [ 00:25:52.781 { 00:25:52.781 "name": "COMP_lvs0/lv0", 00:25:52.781 "aliases": [ 00:25:52.781 "987f19db-d431-5fb4-a097-ec650966c314" 00:25:52.781 ], 00:25:52.781 "product_name": "compress", 00:25:52.781 "block_size": 512, 00:25:52.781 "num_blocks": 200704, 00:25:52.781 "uuid": "987f19db-d431-5fb4-a097-ec650966c314", 00:25:52.781 "assigned_rate_limits": { 00:25:52.781 "rw_ios_per_sec": 0, 00:25:52.781 "rw_mbytes_per_sec": 0, 00:25:52.781 "r_mbytes_per_sec": 0, 00:25:52.781 "w_mbytes_per_sec": 0 00:25:52.781 }, 00:25:52.781 "claimed": false, 00:25:52.781 "zoned": false, 00:25:52.781 "supported_io_types": { 00:25:52.781 "read": true, 00:25:52.781 "write": true, 00:25:52.781 "unmap": false, 00:25:52.781 "write_zeroes": true, 00:25:52.781 "flush": false, 00:25:52.781 "reset": false, 00:25:52.781 "compare": false, 00:25:52.781 "compare_and_write": false, 00:25:52.781 "abort": false, 00:25:52.781 "nvme_admin": false, 00:25:52.781 "nvme_io": false 00:25:52.781 }, 00:25:52.781 "driver_specific": { 00:25:52.781 "compress": { 00:25:52.781 "name": "COMP_lvs0/lv0", 00:25:52.781 "base_bdev_name": 
"6f3775e0-758e-4ce9-bb05-e437e4109132" 00:25:52.781 } 00:25:52.781 } 00:25:52.781 } 00:25:52.781 ] 00:25:52.781 13:54:07 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:25:52.781 13:54:07 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:52.781 [2024-06-10 13:54:07.194503] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f92f81b1440 PMD being used: compress_qat 00:25:52.781 [2024-06-10 13:54:07.196126] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21697c0 PMD being used: compress_qat 00:25:52.781 Running I/O for 3 seconds... 00:25:56.076 00:25:56.076 Latency(us) 00:25:56.076 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:56.076 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:56.076 Verification LBA range: start 0x0 length 0x3100 00:25:56.076 COMP_lvs0/lv0 : 3.00 5998.42 23.43 0.00 0.00 5289.83 481.28 4778.67 00:25:56.076 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:56.076 Verification LBA range: start 0x3100 length 0x3100 00:25:56.076 COMP_lvs0/lv0 : 3.00 6176.91 24.13 0.00 0.00 5147.13 353.28 4642.13 00:25:56.076 =================================================================================================================== 00:25:56.076 Total : 12175.34 47.56 0.00 0.00 5217.44 353.28 4778.67 00:25:56.076 0 00:25:56.076 13:54:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:25:56.076 13:54:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:56.076 13:54:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:56.336 13:54:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM 
EXIT 00:25:56.336 13:54:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 1701532 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1701532 ']' 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1701532 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1701532 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1701532' 00:25:56.336 killing process with pid 1701532 00:25:56.336 13:54:10 compress_compdev -- common/autotest_common.sh@968 -- # kill 1701532 00:25:56.336 Received shutdown signal, test time was about 3.000000 seconds 00:25:56.336 00:25:56.336 Latency(us) 00:25:56.336 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:56.337 =================================================================================================================== 00:25:56.337 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:56.337 13:54:10 compress_compdev -- common/autotest_common.sh@973 -- # wait 1701532 00:25:58.246 13:54:12 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:25:58.246 13:54:12 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:58.246 13:54:12 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1703968 00:25:58.246 13:54:12 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:58.246 13:54:12 compress_compdev -- 
compress/compress.sh@73 -- # waitforlisten 1703968 00:25:58.246 13:54:12 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1703968 ']' 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:58.246 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:58.246 13:54:12 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:58.246 [2024-06-10 13:54:12.716853] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:25:58.246 [2024-06-10 13:54:12.716904] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1703968 ] 00:25:58.506 [2024-06-10 13:54:12.787346] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:58.506 [2024-06-10 13:54:12.853359] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:25:58.506 [2024-06-10 13:54:12.853452] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:25:59.076 [2024-06-10 13:54:13.263728] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:59.336 13:54:13 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:59.336 13:54:13 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:25:59.336 13:54:13 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:25:59.336 13:54:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:59.336 13:54:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:59.906 [2024-06-10 13:54:14.076538] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1970130 PMD being used: compress_qat 00:25:59.906 13:54:14 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:59.906 13:54:14 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:59.906 13:54:14 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:00.166 [ 00:26:00.166 { 00:26:00.166 "name": "Nvme0n1", 00:26:00.166 "aliases": [ 00:26:00.166 "36344730-5260-5504-0025-3845000000bb" 00:26:00.166 ], 00:26:00.166 "product_name": "NVMe disk", 00:26:00.166 "block_size": 512, 00:26:00.166 "num_blocks": 3750748848, 00:26:00.166 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:00.166 "assigned_rate_limits": { 00:26:00.166 "rw_ios_per_sec": 0, 00:26:00.166 "rw_mbytes_per_sec": 0, 00:26:00.166 "r_mbytes_per_sec": 0, 00:26:00.166 "w_mbytes_per_sec": 0 00:26:00.166 }, 00:26:00.166 "claimed": false, 00:26:00.166 "zoned": false, 00:26:00.166 "supported_io_types": { 00:26:00.166 "read": true, 00:26:00.166 "write": true, 00:26:00.166 "unmap": true, 00:26:00.166 "write_zeroes": true, 00:26:00.166 "flush": true, 00:26:00.166 "reset": true, 00:26:00.166 "compare": true, 00:26:00.167 "compare_and_write": false, 00:26:00.167 "abort": true, 00:26:00.167 "nvme_admin": true, 00:26:00.167 "nvme_io": true 00:26:00.167 }, 00:26:00.167 "driver_specific": { 00:26:00.167 "nvme": [ 00:26:00.167 { 00:26:00.167 "pci_address": "0000:65:00.0", 00:26:00.167 "trid": { 00:26:00.167 "trtype": "PCIe", 00:26:00.167 "traddr": "0000:65:00.0" 00:26:00.167 }, 00:26:00.167 "ctrlr_data": { 00:26:00.167 "cntlid": 6, 00:26:00.167 "vendor_id": "0x144d", 00:26:00.167 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:26:00.167 "serial_number": "S64GNE0R605504", 00:26:00.167 "firmware_revision": "GDC5302Q", 00:26:00.167 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:00.167 "oacs": { 00:26:00.167 "security": 1, 00:26:00.167 "format": 1, 00:26:00.167 "firmware": 1, 00:26:00.167 "ns_manage": 1 00:26:00.167 }, 
00:26:00.167 "multi_ctrlr": false, 00:26:00.167 "ana_reporting": false 00:26:00.167 }, 00:26:00.167 "vs": { 00:26:00.167 "nvme_version": "1.4" 00:26:00.167 }, 00:26:00.167 "ns_data": { 00:26:00.167 "id": 1, 00:26:00.167 "can_share": false 00:26:00.167 }, 00:26:00.167 "security": { 00:26:00.167 "opal": true 00:26:00.167 } 00:26:00.167 } 00:26:00.167 ], 00:26:00.167 "mp_policy": "active_passive" 00:26:00.167 } 00:26:00.167 } 00:26:00.167 ] 00:26:00.167 13:54:14 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:00.167 13:54:14 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:00.426 [2024-06-10 13:54:14.667903] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17be5a0 PMD being used: compress_qat 00:26:00.995 b50a01f2-df48-44c7-8628-ea2f51db2b92 00:26:00.995 13:54:15 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:00.995 f3182ef0-c021-4d72-b842-f544a1512038 00:26:01.254 13:54:15 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:01.254 13:54:15 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:01.515 [ 00:26:01.515 { 00:26:01.515 "name": "f3182ef0-c021-4d72-b842-f544a1512038", 00:26:01.515 "aliases": [ 00:26:01.515 "lvs0/lv0" 00:26:01.515 ], 00:26:01.515 "product_name": "Logical Volume", 00:26:01.515 "block_size": 512, 00:26:01.515 "num_blocks": 204800, 00:26:01.515 "uuid": "f3182ef0-c021-4d72-b842-f544a1512038", 00:26:01.515 "assigned_rate_limits": { 00:26:01.515 "rw_ios_per_sec": 0, 00:26:01.515 "rw_mbytes_per_sec": 0, 00:26:01.515 "r_mbytes_per_sec": 0, 00:26:01.515 "w_mbytes_per_sec": 0 00:26:01.515 }, 00:26:01.515 "claimed": false, 00:26:01.515 "zoned": false, 00:26:01.515 "supported_io_types": { 00:26:01.515 "read": true, 00:26:01.515 "write": true, 00:26:01.515 "unmap": true, 00:26:01.515 "write_zeroes": true, 00:26:01.515 "flush": false, 00:26:01.515 "reset": true, 00:26:01.515 "compare": false, 00:26:01.515 "compare_and_write": false, 00:26:01.515 "abort": false, 00:26:01.515 "nvme_admin": false, 00:26:01.515 "nvme_io": false 00:26:01.515 }, 00:26:01.515 "driver_specific": { 00:26:01.515 "lvol": { 00:26:01.515 "lvol_store_uuid": "b50a01f2-df48-44c7-8628-ea2f51db2b92", 00:26:01.515 "base_bdev": "Nvme0n1", 00:26:01.515 "thin_provision": true, 00:26:01.515 "num_allocated_clusters": 0, 00:26:01.515 "snapshot": false, 00:26:01.515 "clone": false, 00:26:01.515 "esnap_clone": false 00:26:01.515 } 00:26:01.515 } 00:26:01.515 } 00:26:01.515 ] 00:26:01.515 13:54:15 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:01.515 13:54:15 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:26:01.515 13:54:15 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:26:01.776 [2024-06-10 13:54:16.053135] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:01.776 COMP_lvs0/lv0 00:26:01.776 13:54:16 compress_compdev -- 
compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:01.776 13:54:16 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:02.037 13:54:16 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:02.037 [ 00:26:02.037 { 00:26:02.037 "name": "COMP_lvs0/lv0", 00:26:02.037 "aliases": [ 00:26:02.037 "8fdc529e-57d3-55c8-8ea8-7661de494599" 00:26:02.037 ], 00:26:02.037 "product_name": "compress", 00:26:02.037 "block_size": 4096, 00:26:02.037 "num_blocks": 25088, 00:26:02.037 "uuid": "8fdc529e-57d3-55c8-8ea8-7661de494599", 00:26:02.037 "assigned_rate_limits": { 00:26:02.037 "rw_ios_per_sec": 0, 00:26:02.037 "rw_mbytes_per_sec": 0, 00:26:02.037 "r_mbytes_per_sec": 0, 00:26:02.037 "w_mbytes_per_sec": 0 00:26:02.037 }, 00:26:02.037 "claimed": false, 00:26:02.037 "zoned": false, 00:26:02.037 "supported_io_types": { 00:26:02.037 "read": true, 00:26:02.037 "write": true, 00:26:02.037 "unmap": false, 00:26:02.037 "write_zeroes": true, 00:26:02.037 "flush": false, 00:26:02.037 "reset": false, 00:26:02.037 "compare": false, 00:26:02.037 "compare_and_write": false, 00:26:02.037 "abort": false, 00:26:02.037 "nvme_admin": false, 00:26:02.037 "nvme_io": false 00:26:02.037 }, 00:26:02.037 "driver_specific": { 00:26:02.038 "compress": { 00:26:02.038 "name": "COMP_lvs0/lv0", 00:26:02.038 "base_bdev_name": 
"f3182ef0-c021-4d72-b842-f544a1512038" 00:26:02.038 } 00:26:02.038 } 00:26:02.038 } 00:26:02.038 ] 00:26:02.038 13:54:16 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:02.038 13:54:16 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:02.300 [2024-06-10 13:54:16.554145] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f022c1b1440 PMD being used: compress_qat 00:26:02.300 [2024-06-10 13:54:16.555856] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a6560 PMD being used: compress_qat 00:26:02.300 Running I/O for 3 seconds... 00:26:05.602 00:26:05.602 Latency(us) 00:26:05.602 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:05.602 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:05.602 Verification LBA range: start 0x0 length 0x3100 00:26:05.602 COMP_lvs0/lv0 : 3.00 5903.77 23.06 0.00 0.00 5373.29 453.97 4724.05 00:26:05.602 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:05.602 Verification LBA range: start 0x3100 length 0x3100 00:26:05.602 COMP_lvs0/lv0 : 3.00 6085.60 23.77 0.00 0.00 5222.70 375.47 4642.13 00:26:05.602 =================================================================================================================== 00:26:05.602 Total : 11989.37 46.83 0.00 0.00 5296.86 375.47 4724.05 00:26:05.602 0 00:26:05.602 13:54:19 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:26:05.602 13:54:19 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:05.602 13:54:19 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:05.602 13:54:19 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM 
EXIT 00:26:05.602 13:54:19 compress_compdev -- compress/compress.sh@78 -- # killprocess 1703968 00:26:05.602 13:54:19 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1703968 ']' 00:26:05.602 13:54:19 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1703968 00:26:05.602 13:54:19 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:05.602 13:54:19 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:05.602 13:54:19 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1703968 00:26:05.602 13:54:20 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:05.602 13:54:20 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:05.602 13:54:20 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1703968' 00:26:05.602 killing process with pid 1703968 00:26:05.602 13:54:20 compress_compdev -- common/autotest_common.sh@968 -- # kill 1703968 00:26:05.602 Received shutdown signal, test time was about 3.000000 seconds 00:26:05.602 00:26:05.602 Latency(us) 00:26:05.602 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:05.602 =================================================================================================================== 00:26:05.602 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:05.602 13:54:20 compress_compdev -- common/autotest_common.sh@973 -- # wait 1703968 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1705644 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@57 -- # 
waitforlisten 1705644 00:26:07.512 13:54:21 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 1705644 ']' 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:07.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:07.512 13:54:21 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:07.771 [2024-06-10 13:54:22.029019] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:26:07.771 [2024-06-10 13:54:22.029070] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1705644 ] 00:26:07.771 [2024-06-10 13:54:22.114505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:07.772 [2024-06-10 13:54:22.184387] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:07.772 [2024-06-10 13:54:22.184560] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:07.772 [2024-06-10 13:54:22.184565] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:26:08.340 [2024-06-10 13:54:22.601170] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:26:08.600 13:54:22 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:08.600 13:54:22 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:26:08.600 13:54:22 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:26:08.600 13:54:22 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:08.600 13:54:22 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:09.169 [2024-06-10 13:54:23.389168] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfd1ac0 PMD being used: compress_qat 00:26:09.169 13:54:23 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:09.169 
13:54:23 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:09.169 13:54:23 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:09.429 [ 00:26:09.429 { 00:26:09.429 "name": "Nvme0n1", 00:26:09.429 "aliases": [ 00:26:09.429 "36344730-5260-5504-0025-3845000000bb" 00:26:09.429 ], 00:26:09.429 "product_name": "NVMe disk", 00:26:09.429 "block_size": 512, 00:26:09.429 "num_blocks": 3750748848, 00:26:09.429 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:09.429 "assigned_rate_limits": { 00:26:09.429 "rw_ios_per_sec": 0, 00:26:09.429 "rw_mbytes_per_sec": 0, 00:26:09.429 "r_mbytes_per_sec": 0, 00:26:09.429 "w_mbytes_per_sec": 0 00:26:09.429 }, 00:26:09.429 "claimed": false, 00:26:09.429 "zoned": false, 00:26:09.429 "supported_io_types": { 00:26:09.429 "read": true, 00:26:09.429 "write": true, 00:26:09.429 "unmap": true, 00:26:09.429 "write_zeroes": true, 00:26:09.429 "flush": true, 00:26:09.429 "reset": true, 00:26:09.429 "compare": true, 00:26:09.429 "compare_and_write": false, 00:26:09.429 "abort": true, 00:26:09.429 "nvme_admin": true, 00:26:09.429 "nvme_io": true 00:26:09.429 }, 00:26:09.429 "driver_specific": { 00:26:09.429 "nvme": [ 00:26:09.429 { 00:26:09.429 "pci_address": "0000:65:00.0", 00:26:09.429 "trid": { 00:26:09.429 "trtype": "PCIe", 00:26:09.429 "traddr": "0000:65:00.0" 00:26:09.429 }, 00:26:09.429 "ctrlr_data": { 00:26:09.429 "cntlid": 6, 00:26:09.429 "vendor_id": "0x144d", 00:26:09.429 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:26:09.429 "serial_number": "S64GNE0R605504", 00:26:09.429 "firmware_revision": "GDC5302Q", 00:26:09.429 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:09.429 "oacs": { 00:26:09.429 "security": 1, 
00:26:09.429 "format": 1, 00:26:09.429 "firmware": 1, 00:26:09.429 "ns_manage": 1 00:26:09.429 }, 00:26:09.429 "multi_ctrlr": false, 00:26:09.429 "ana_reporting": false 00:26:09.429 }, 00:26:09.429 "vs": { 00:26:09.429 "nvme_version": "1.4" 00:26:09.429 }, 00:26:09.429 "ns_data": { 00:26:09.429 "id": 1, 00:26:09.429 "can_share": false 00:26:09.429 }, 00:26:09.429 "security": { 00:26:09.429 "opal": true 00:26:09.429 } 00:26:09.429 } 00:26:09.429 ], 00:26:09.429 "mp_policy": "active_passive" 00:26:09.429 } 00:26:09.429 } 00:26:09.429 ] 00:26:09.429 13:54:23 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:09.429 13:54:23 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:09.688 [2024-06-10 13:54:23.980624] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfd29c0 PMD being used: compress_qat 00:26:10.257 7eec9a40-1e0e-4eb5-8448-4bb5da90ac3f 00:26:10.257 13:54:24 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:10.517 d310243e-503d-4733-b3f7-e528b0ee96c8 00:26:10.517 13:54:24 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:10.517 13:54:24 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:10.777 13:54:25 compress_compdev -- 
common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:11.037 [ 00:26:11.037 { 00:26:11.037 "name": "d310243e-503d-4733-b3f7-e528b0ee96c8", 00:26:11.037 "aliases": [ 00:26:11.037 "lvs0/lv0" 00:26:11.037 ], 00:26:11.037 "product_name": "Logical Volume", 00:26:11.037 "block_size": 512, 00:26:11.037 "num_blocks": 204800, 00:26:11.037 "uuid": "d310243e-503d-4733-b3f7-e528b0ee96c8", 00:26:11.037 "assigned_rate_limits": { 00:26:11.037 "rw_ios_per_sec": 0, 00:26:11.037 "rw_mbytes_per_sec": 0, 00:26:11.037 "r_mbytes_per_sec": 0, 00:26:11.037 "w_mbytes_per_sec": 0 00:26:11.037 }, 00:26:11.037 "claimed": false, 00:26:11.037 "zoned": false, 00:26:11.037 "supported_io_types": { 00:26:11.037 "read": true, 00:26:11.037 "write": true, 00:26:11.037 "unmap": true, 00:26:11.037 "write_zeroes": true, 00:26:11.037 "flush": false, 00:26:11.037 "reset": true, 00:26:11.037 "compare": false, 00:26:11.037 "compare_and_write": false, 00:26:11.037 "abort": false, 00:26:11.037 "nvme_admin": false, 00:26:11.037 "nvme_io": false 00:26:11.037 }, 00:26:11.037 "driver_specific": { 00:26:11.037 "lvol": { 00:26:11.037 "lvol_store_uuid": "7eec9a40-1e0e-4eb5-8448-4bb5da90ac3f", 00:26:11.037 "base_bdev": "Nvme0n1", 00:26:11.037 "thin_provision": true, 00:26:11.037 "num_allocated_clusters": 0, 00:26:11.037 "snapshot": false, 00:26:11.037 "clone": false, 00:26:11.037 "esnap_clone": false 00:26:11.037 } 00:26:11.037 } 00:26:11.037 } 00:26:11.037 ] 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:11.037 13:54:25 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:11.037 13:54:25 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:11.037 [2024-06-10 13:54:25.434176] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev 
for: COMP_lvs0/lv0 00:26:11.037 COMP_lvs0/lv0 00:26:11.037 13:54:25 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:11.037 13:54:25 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:11.298 13:54:25 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:11.607 [ 00:26:11.607 { 00:26:11.607 "name": "COMP_lvs0/lv0", 00:26:11.607 "aliases": [ 00:26:11.607 "caab742d-ed8b-5a2d-9220-d34b9b0d13b1" 00:26:11.607 ], 00:26:11.607 "product_name": "compress", 00:26:11.607 "block_size": 512, 00:26:11.607 "num_blocks": 200704, 00:26:11.607 "uuid": "caab742d-ed8b-5a2d-9220-d34b9b0d13b1", 00:26:11.607 "assigned_rate_limits": { 00:26:11.607 "rw_ios_per_sec": 0, 00:26:11.607 "rw_mbytes_per_sec": 0, 00:26:11.607 "r_mbytes_per_sec": 0, 00:26:11.607 "w_mbytes_per_sec": 0 00:26:11.607 }, 00:26:11.607 "claimed": false, 00:26:11.607 "zoned": false, 00:26:11.607 "supported_io_types": { 00:26:11.607 "read": true, 00:26:11.607 "write": true, 00:26:11.607 "unmap": false, 00:26:11.607 "write_zeroes": true, 00:26:11.607 "flush": false, 00:26:11.607 "reset": false, 00:26:11.607 "compare": false, 00:26:11.607 "compare_and_write": false, 00:26:11.607 "abort": false, 00:26:11.607 "nvme_admin": false, 00:26:11.607 "nvme_io": false 00:26:11.607 }, 00:26:11.607 "driver_specific": { 00:26:11.607 "compress": 
{ 00:26:11.607 "name": "COMP_lvs0/lv0", 00:26:11.607 "base_bdev_name": "d310243e-503d-4733-b3f7-e528b0ee96c8" 00:26:11.607 } 00:26:11.607 } 00:26:11.607 } 00:26:11.607 ] 00:26:11.607 13:54:25 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:26:11.607 13:54:25 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:11.607 [2024-06-10 13:54:25.994573] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc5d01b11d0 PMD being used: compress_qat 00:26:11.607 I/O targets: 00:26:11.607 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:26:11.607 00:26:11.607 00:26:11.607 CUnit - A unit testing framework for C - Version 2.1-3 00:26:11.607 http://cunit.sourceforge.net/ 00:26:11.607 00:26:11.607 00:26:11.607 Suite: bdevio tests on: COMP_lvs0/lv0 00:26:11.607 Test: blockdev write read block ...passed 00:26:11.607 Test: blockdev write zeroes read block ...passed 00:26:11.608 Test: blockdev write zeroes read no split ...passed 00:26:11.608 Test: blockdev write zeroes read split ...passed 00:26:11.608 Test: blockdev write zeroes read split partial ...passed 00:26:11.608 Test: blockdev reset ...[2024-06-10 13:54:26.027248] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:26:11.608 passed 00:26:11.608 Test: blockdev write read 8 blocks ...passed 00:26:11.608 Test: blockdev write read size > 128k ...passed 00:26:11.608 Test: blockdev write read invalid size ...passed 00:26:11.608 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:11.608 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:11.608 Test: blockdev write read max offset ...passed 00:26:11.608 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:11.608 Test: blockdev writev readv 8 blocks ...passed 00:26:11.608 Test: blockdev writev readv 30 x 1block ...passed 00:26:11.608 Test: blockdev writev 
readv block ...passed 00:26:11.608 Test: blockdev writev readv size > 128k ...passed 00:26:11.608 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:11.608 Test: blockdev comparev and writev ...passed 00:26:11.608 Test: blockdev nvme passthru rw ...passed 00:26:11.608 Test: blockdev nvme passthru vendor specific ...passed 00:26:11.608 Test: blockdev nvme admin passthru ...passed 00:26:11.608 Test: blockdev copy ...passed 00:26:11.608 00:26:11.608 Run Summary: Type Total Ran Passed Failed Inactive 00:26:11.608 suites 1 1 n/a 0 0 00:26:11.608 tests 23 23 23 0 0 00:26:11.608 asserts 130 130 130 0 n/a 00:26:11.608 00:26:11.608 Elapsed time = 0.083 seconds 00:26:11.608 0 00:26:11.608 13:54:26 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:26:11.608 13:54:26 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:11.907 13:54:26 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:12.168 13:54:26 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:26:12.168 13:54:26 compress_compdev -- compress/compress.sh@62 -- # killprocess 1705644 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 1705644 ']' 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 1705644 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1705644 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:12.168 
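A quick arithmetic cross-check of the bdev geometry reported in the `bdev_get_bdevs` output and the bdevio I/O-targets line above (a minimal sketch; the block counts are copied from this log, and the helper name `to_mib` is ours, not an SPDK API):

```python
# Sanity-check the bdev sizes reported earlier in this log:
# the lvol lvs0/lv0 reports 204800 blocks of 512 bytes, and the
# compress vbdev COMP_lvs0/lv0 exposes 200704 blocks of 512 bytes,
# which bdevio prints as "200704 blocks of 512 bytes (98 MiB)".
BLOCK_SIZE = 512
LVOL_BLOCKS = 204800   # lvs0/lv0, created with "bdev_lvol_create -t -l lvs0 lv0 100"
COMP_BLOCKS = 200704   # COMP_lvs0/lv0, the compress bdev layered on top

def to_mib(num_blocks: int, block_size: int = BLOCK_SIZE) -> float:
    """Convert a block count to MiB."""
    return num_blocks * block_size / (1024 * 1024)

print(to_mib(LVOL_BLOCKS))  # → 100.0 (the 100 MiB backing lvol)
print(to_mib(COMP_BLOCKS))  # → 98.0  (matches bdevio's "98 MiB")
```

The 2 MiB difference between the lvol and the compress bdev it backs is visible directly in the two block counts; the sketch only confirms the unit conversion.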
13:54:26 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1705644' 00:26:12.168 killing process with pid 1705644 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@968 -- # kill 1705644 00:26:12.168 13:54:26 compress_compdev -- common/autotest_common.sh@973 -- # wait 1705644 00:26:14.080 13:54:28 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:26:14.080 13:54:28 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:26:14.080 00:26:14.080 real 0m34.953s 00:26:14.080 user 1m22.319s 00:26:14.080 sys 0m3.622s 00:26:14.080 13:54:28 compress_compdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:14.080 13:54:28 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:26:14.080 ************************************ 00:26:14.080 END TEST compress_compdev 00:26:14.080 ************************************ 00:26:14.080 13:54:28 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:14.341 13:54:28 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:14.341 13:54:28 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:14.341 13:54:28 -- common/autotest_common.sh@10 -- # set +x 00:26:14.341 ************************************ 00:26:14.341 START TEST compress_isal 00:26:14.341 ************************************ 00:26:14.341 13:54:28 compress_isal -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:26:14.341 * Looking for test storage... 
00:26:14.341 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:26:14.341 13:54:28 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:26:14.341 13:54:28 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:26:14.341 13:54:28 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:26:14.341 13:54:28 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:26:14.341 13:54:28 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:26:14.342 13:54:28 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:26:14.342 13:54:28 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:26:14.342 13:54:28 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:26:14.342 13:54:28 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:14.342 13:54:28 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:14.342 13:54:28 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:14.342 13:54:28 compress_isal -- paths/export.sh@5 -- # export PATH 00:26:14.342 13:54:28 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@47 -- # : 0 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:26:14.342 13:54:28 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1707046 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:14.342 13:54:28 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1707046 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1707046 ']' 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:14.342 13:54:28 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:14.342 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:14.342 13:54:28 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:14.342 [2024-06-10 13:54:28.789027] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:26:14.342 [2024-06-10 13:54:28.789094] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1707046 ] 00:26:14.602 [2024-06-10 13:54:28.867748] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:14.602 [2024-06-10 13:54:28.940703] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:14.602 [2024-06-10 13:54:28.940708] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:15.543 13:54:29 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:15.543 13:54:29 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:26:15.543 13:54:29 compress_isal -- compress/compress.sh@74 -- # create_vols 00:26:15.543 13:54:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:15.543 13:54:29 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:15.803 13:54:30 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:15.803 13:54:30 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:16.063 13:54:30 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 
00:26:16.325 [ 00:26:16.325 { 00:26:16.325 "name": "Nvme0n1", 00:26:16.325 "aliases": [ 00:26:16.325 "36344730-5260-5504-0025-3845000000bb" 00:26:16.325 ], 00:26:16.325 "product_name": "NVMe disk", 00:26:16.325 "block_size": 512, 00:26:16.325 "num_blocks": 3750748848, 00:26:16.325 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:16.325 "assigned_rate_limits": { 00:26:16.325 "rw_ios_per_sec": 0, 00:26:16.325 "rw_mbytes_per_sec": 0, 00:26:16.325 "r_mbytes_per_sec": 0, 00:26:16.325 "w_mbytes_per_sec": 0 00:26:16.325 }, 00:26:16.325 "claimed": false, 00:26:16.325 "zoned": false, 00:26:16.325 "supported_io_types": { 00:26:16.325 "read": true, 00:26:16.325 "write": true, 00:26:16.325 "unmap": true, 00:26:16.325 "write_zeroes": true, 00:26:16.325 "flush": true, 00:26:16.325 "reset": true, 00:26:16.325 "compare": true, 00:26:16.325 "compare_and_write": false, 00:26:16.325 "abort": true, 00:26:16.325 "nvme_admin": true, 00:26:16.325 "nvme_io": true 00:26:16.325 }, 00:26:16.325 "driver_specific": { 00:26:16.325 "nvme": [ 00:26:16.325 { 00:26:16.325 "pci_address": "0000:65:00.0", 00:26:16.325 "trid": { 00:26:16.325 "trtype": "PCIe", 00:26:16.325 "traddr": "0000:65:00.0" 00:26:16.325 }, 00:26:16.325 "ctrlr_data": { 00:26:16.325 "cntlid": 6, 00:26:16.325 "vendor_id": "0x144d", 00:26:16.325 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:26:16.325 "serial_number": "S64GNE0R605504", 00:26:16.325 "firmware_revision": "GDC5302Q", 00:26:16.325 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:16.325 "oacs": { 00:26:16.325 "security": 1, 00:26:16.325 "format": 1, 00:26:16.325 "firmware": 1, 00:26:16.325 "ns_manage": 1 00:26:16.325 }, 00:26:16.325 "multi_ctrlr": false, 00:26:16.325 "ana_reporting": false 00:26:16.325 }, 00:26:16.325 "vs": { 00:26:16.325 "nvme_version": "1.4" 00:26:16.325 }, 00:26:16.325 "ns_data": { 00:26:16.325 "id": 1, 00:26:16.325 "can_share": false 00:26:16.325 }, 00:26:16.325 "security": { 00:26:16.325 "opal": true 00:26:16.325 
} 00:26:16.325 } 00:26:16.325 ], 00:26:16.325 "mp_policy": "active_passive" 00:26:16.325 } 00:26:16.325 } 00:26:16.325 ] 00:26:16.325 13:54:30 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:16.325 13:54:30 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:17.268 bc77981a-4603-4ea7-a834-241bbdb14edd 00:26:17.268 13:54:31 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:17.268 5820de11-bb1b-498f-93cb-ac34b63ea39e 00:26:17.268 13:54:31 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:17.268 13:54:31 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:17.528 13:54:31 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:17.788 [ 00:26:17.788 { 00:26:17.788 "name": "5820de11-bb1b-498f-93cb-ac34b63ea39e", 00:26:17.788 "aliases": [ 00:26:17.788 "lvs0/lv0" 00:26:17.788 ], 00:26:17.788 "product_name": "Logical Volume", 00:26:17.788 "block_size": 512, 00:26:17.788 "num_blocks": 204800, 00:26:17.788 "uuid": "5820de11-bb1b-498f-93cb-ac34b63ea39e", 00:26:17.788 "assigned_rate_limits": { 00:26:17.788 "rw_ios_per_sec": 0, 00:26:17.788 "rw_mbytes_per_sec": 0, 00:26:17.789 
"r_mbytes_per_sec": 0, 00:26:17.789 "w_mbytes_per_sec": 0 00:26:17.789 }, 00:26:17.789 "claimed": false, 00:26:17.789 "zoned": false, 00:26:17.789 "supported_io_types": { 00:26:17.789 "read": true, 00:26:17.789 "write": true, 00:26:17.789 "unmap": true, 00:26:17.789 "write_zeroes": true, 00:26:17.789 "flush": false, 00:26:17.789 "reset": true, 00:26:17.789 "compare": false, 00:26:17.789 "compare_and_write": false, 00:26:17.789 "abort": false, 00:26:17.789 "nvme_admin": false, 00:26:17.789 "nvme_io": false 00:26:17.789 }, 00:26:17.789 "driver_specific": { 00:26:17.789 "lvol": { 00:26:17.789 "lvol_store_uuid": "bc77981a-4603-4ea7-a834-241bbdb14edd", 00:26:17.789 "base_bdev": "Nvme0n1", 00:26:17.789 "thin_provision": true, 00:26:17.789 "num_allocated_clusters": 0, 00:26:17.789 "snapshot": false, 00:26:17.789 "clone": false, 00:26:17.789 "esnap_clone": false 00:26:17.789 } 00:26:17.789 } 00:26:17.789 } 00:26:17.789 ] 00:26:17.789 13:54:32 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:17.789 13:54:32 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:17.789 13:54:32 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:18.048 [2024-06-10 13:54:32.317386] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:18.048 COMP_lvs0/lv0 00:26:18.048 13:54:32 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:18.048 13:54:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:18.048 13:54:32 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:18.048 13:54:32 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:18.049 13:54:32 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:18.049 13:54:32 compress_isal -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:26:18.049 13:54:32 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:18.308 13:54:32 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:18.308 [ 00:26:18.308 { 00:26:18.308 "name": "COMP_lvs0/lv0", 00:26:18.308 "aliases": [ 00:26:18.308 "384c43f3-29d9-5d16-ba20-23d0a137f155" 00:26:18.308 ], 00:26:18.308 "product_name": "compress", 00:26:18.308 "block_size": 512, 00:26:18.308 "num_blocks": 200704, 00:26:18.308 "uuid": "384c43f3-29d9-5d16-ba20-23d0a137f155", 00:26:18.308 "assigned_rate_limits": { 00:26:18.308 "rw_ios_per_sec": 0, 00:26:18.308 "rw_mbytes_per_sec": 0, 00:26:18.308 "r_mbytes_per_sec": 0, 00:26:18.308 "w_mbytes_per_sec": 0 00:26:18.308 }, 00:26:18.308 "claimed": false, 00:26:18.308 "zoned": false, 00:26:18.308 "supported_io_types": { 00:26:18.308 "read": true, 00:26:18.308 "write": true, 00:26:18.308 "unmap": false, 00:26:18.308 "write_zeroes": true, 00:26:18.308 "flush": false, 00:26:18.308 "reset": false, 00:26:18.308 "compare": false, 00:26:18.308 "compare_and_write": false, 00:26:18.308 "abort": false, 00:26:18.308 "nvme_admin": false, 00:26:18.308 "nvme_io": false 00:26:18.308 }, 00:26:18.308 "driver_specific": { 00:26:18.308 "compress": { 00:26:18.308 "name": "COMP_lvs0/lv0", 00:26:18.308 "base_bdev_name": "5820de11-bb1b-498f-93cb-ac34b63ea39e" 00:26:18.308 } 00:26:18.308 } 00:26:18.308 } 00:26:18.308 ] 00:26:18.308 13:54:32 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:18.308 13:54:32 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:18.567 Running I/O for 3 seconds... 
00:26:21.867 00:26:21.867 Latency(us) 00:26:21.867 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:21.867 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:21.867 Verification LBA range: start 0x0 length 0x3100 00:26:21.867 COMP_lvs0/lv0 : 3.00 4279.29 16.72 0.00 0.00 7425.75 709.97 6335.15 00:26:21.867 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:21.867 Verification LBA range: start 0x3100 length 0x3100 00:26:21.867 COMP_lvs0/lv0 : 3.00 4285.90 16.74 0.00 0.00 7424.45 559.79 6307.84 00:26:21.867 =================================================================================================================== 00:26:21.867 Total : 8565.19 33.46 0.00 0.00 7425.10 559.79 6335.15 00:26:21.867 0 00:26:21.867 13:54:35 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:26:21.867 13:54:35 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:21.867 13:54:36 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:21.867 13:54:36 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:21.867 13:54:36 compress_isal -- compress/compress.sh@78 -- # killprocess 1707046 00:26:21.867 13:54:36 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1707046 ']' 00:26:21.867 13:54:36 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1707046 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@954 -- # uname 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1707046 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:22.129 13:54:36 compress_isal 
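The Total row of the bdevperf latency table above can be reproduced from the two per-core `COMP_lvs0/lv0` rows (a sketch using the numbers printed in this log; the claim that the Total average is IOPS-weighted is our reading of the figures, not a statement from bdevperf's documentation):

```python
# Recompute the "Total" row of the bdevperf summary from the two
# per-core rows (core mask 0x6 means cores 1 and 2 each ran one job).
rows = [
    {"iops": 4279.29, "mibs": 16.72, "avg_us": 7425.75},  # Core Mask 0x2
    {"iops": 4285.90, "mibs": 16.74, "avg_us": 7424.45},  # Core Mask 0x4
]

total_iops = sum(r["iops"] for r in rows)
total_mibs = sum(r["mibs"] for r in rows)
# An IOPS-weighted mean of the per-core averages lands on the
# reported 7425.10 us; a plain mean would be almost identical here
# because the two jobs completed nearly the same number of I/Os.
weighted_avg_us = sum(r["iops"] * r["avg_us"] for r in rows) / total_iops

print(round(total_iops, 2))  # → 8565.19, matching the Total row
print(round(total_mibs, 2))  # → 33.46
```

IOPS and MiB/s sum exactly; the averaged latency agrees with the Total row to within rounding.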
-- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1707046' 00:26:22.129 killing process with pid 1707046 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@968 -- # kill 1707046 00:26:22.129 Received shutdown signal, test time was about 3.000000 seconds 00:26:22.129 00:26:22.129 Latency(us) 00:26:22.129 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:22.129 =================================================================================================================== 00:26:22.129 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:22.129 13:54:36 compress_isal -- common/autotest_common.sh@973 -- # wait 1707046 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1708900 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1708900 00:26:24.044 13:54:38 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1708900 ']' 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:26:24.044 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:24.044 13:54:38 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:24.044 [2024-06-10 13:54:38.402787] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:24.044 [2024-06-10 13:54:38.402847] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1708900 ] 00:26:24.044 [2024-06-10 13:54:38.474837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:24.305 [2024-06-10 13:54:38.541263] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:24.305 [2024-06-10 13:54:38.541411] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:24.876 13:54:39 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:24.876 13:54:39 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:26:24.876 13:54:39 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:26:24.876 13:54:39 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:24.877 13:54:39 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:25.448 13:54:39 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:25.448 13:54:39 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:25.448 13:54:39 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:25.448 13:54:39 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:25.448 13:54:39 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:25.448 13:54:39 
compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:25.448 13:54:39 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:25.708 13:54:40 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:25.968 [ 00:26:25.968 { 00:26:25.968 "name": "Nvme0n1", 00:26:25.968 "aliases": [ 00:26:25.968 "36344730-5260-5504-0025-3845000000bb" 00:26:25.968 ], 00:26:25.968 "product_name": "NVMe disk", 00:26:25.968 "block_size": 512, 00:26:25.968 "num_blocks": 3750748848, 00:26:25.968 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:25.968 "assigned_rate_limits": { 00:26:25.968 "rw_ios_per_sec": 0, 00:26:25.968 "rw_mbytes_per_sec": 0, 00:26:25.968 "r_mbytes_per_sec": 0, 00:26:25.968 "w_mbytes_per_sec": 0 00:26:25.968 }, 00:26:25.968 "claimed": false, 00:26:25.968 "zoned": false, 00:26:25.968 "supported_io_types": { 00:26:25.968 "read": true, 00:26:25.968 "write": true, 00:26:25.968 "unmap": true, 00:26:25.968 "write_zeroes": true, 00:26:25.968 "flush": true, 00:26:25.968 "reset": true, 00:26:25.968 "compare": true, 00:26:25.968 "compare_and_write": false, 00:26:25.968 "abort": true, 00:26:25.968 "nvme_admin": true, 00:26:25.968 "nvme_io": true 00:26:25.968 }, 00:26:25.968 "driver_specific": { 00:26:25.968 "nvme": [ 00:26:25.968 { 00:26:25.968 "pci_address": "0000:65:00.0", 00:26:25.968 "trid": { 00:26:25.968 "trtype": "PCIe", 00:26:25.968 "traddr": "0000:65:00.0" 00:26:25.969 }, 00:26:25.969 "ctrlr_data": { 00:26:25.969 "cntlid": 6, 00:26:25.969 "vendor_id": "0x144d", 00:26:25.969 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:26:25.969 "serial_number": "S64GNE0R605504", 00:26:25.969 "firmware_revision": "GDC5302Q", 00:26:25.969 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:25.969 "oacs": { 00:26:25.969 "security": 1, 00:26:25.969 
"format": 1, 00:26:25.969 "firmware": 1, 00:26:25.969 "ns_manage": 1 00:26:25.969 }, 00:26:25.969 "multi_ctrlr": false, 00:26:25.969 "ana_reporting": false 00:26:25.969 }, 00:26:25.969 "vs": { 00:26:25.969 "nvme_version": "1.4" 00:26:25.969 }, 00:26:25.969 "ns_data": { 00:26:25.969 "id": 1, 00:26:25.969 "can_share": false 00:26:25.969 }, 00:26:25.969 "security": { 00:26:25.969 "opal": true 00:26:25.969 } 00:26:25.969 } 00:26:25.969 ], 00:26:25.969 "mp_policy": "active_passive" 00:26:25.969 } 00:26:25.969 } 00:26:25.969 ] 00:26:25.969 13:54:40 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:25.969 13:54:40 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:26.540 792bc86e-c61b-40e5-9c1f-66a14d4a5e6b 00:26:26.801 13:54:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:26.801 593c4a99-7291-4015-bf27-53be62992fee 00:26:26.801 13:54:41 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:26.801 13:54:41 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:27.062 13:54:41 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:27.322 [ 00:26:27.322 { 00:26:27.322 "name": 
"593c4a99-7291-4015-bf27-53be62992fee", 00:26:27.322 "aliases": [ 00:26:27.322 "lvs0/lv0" 00:26:27.323 ], 00:26:27.323 "product_name": "Logical Volume", 00:26:27.323 "block_size": 512, 00:26:27.323 "num_blocks": 204800, 00:26:27.323 "uuid": "593c4a99-7291-4015-bf27-53be62992fee", 00:26:27.323 "assigned_rate_limits": { 00:26:27.323 "rw_ios_per_sec": 0, 00:26:27.323 "rw_mbytes_per_sec": 0, 00:26:27.323 "r_mbytes_per_sec": 0, 00:26:27.323 "w_mbytes_per_sec": 0 00:26:27.323 }, 00:26:27.323 "claimed": false, 00:26:27.323 "zoned": false, 00:26:27.323 "supported_io_types": { 00:26:27.323 "read": true, 00:26:27.323 "write": true, 00:26:27.323 "unmap": true, 00:26:27.323 "write_zeroes": true, 00:26:27.323 "flush": false, 00:26:27.323 "reset": true, 00:26:27.323 "compare": false, 00:26:27.323 "compare_and_write": false, 00:26:27.323 "abort": false, 00:26:27.323 "nvme_admin": false, 00:26:27.323 "nvme_io": false 00:26:27.323 }, 00:26:27.323 "driver_specific": { 00:26:27.323 "lvol": { 00:26:27.323 "lvol_store_uuid": "792bc86e-c61b-40e5-9c1f-66a14d4a5e6b", 00:26:27.323 "base_bdev": "Nvme0n1", 00:26:27.323 "thin_provision": true, 00:26:27.323 "num_allocated_clusters": 0, 00:26:27.323 "snapshot": false, 00:26:27.323 "clone": false, 00:26:27.323 "esnap_clone": false 00:26:27.323 } 00:26:27.323 } 00:26:27.323 } 00:26:27.323 ] 00:26:27.323 13:54:41 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:27.323 13:54:41 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:26:27.323 13:54:41 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:26:27.584 [2024-06-10 13:54:41.857071] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:27.584 COMP_lvs0/lv0 00:26:27.584 13:54:41 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:27.584 13:54:41 compress_isal -- 
common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:27.584 13:54:41 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:27.584 13:54:41 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:27.584 13:54:41 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:27.584 13:54:41 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:27.584 13:54:41 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:27.845 13:54:42 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:27.845 [ 00:26:27.845 { 00:26:27.845 "name": "COMP_lvs0/lv0", 00:26:27.845 "aliases": [ 00:26:27.845 "02afb232-111f-57b7-bd9c-6ff5247c8b2b" 00:26:27.845 ], 00:26:27.845 "product_name": "compress", 00:26:27.845 "block_size": 512, 00:26:27.845 "num_blocks": 200704, 00:26:27.845 "uuid": "02afb232-111f-57b7-bd9c-6ff5247c8b2b", 00:26:27.845 "assigned_rate_limits": { 00:26:27.845 "rw_ios_per_sec": 0, 00:26:27.845 "rw_mbytes_per_sec": 0, 00:26:27.845 "r_mbytes_per_sec": 0, 00:26:27.845 "w_mbytes_per_sec": 0 00:26:27.845 }, 00:26:27.845 "claimed": false, 00:26:27.845 "zoned": false, 00:26:27.845 "supported_io_types": { 00:26:27.845 "read": true, 00:26:27.845 "write": true, 00:26:27.845 "unmap": false, 00:26:27.845 "write_zeroes": true, 00:26:27.845 "flush": false, 00:26:27.845 "reset": false, 00:26:27.845 "compare": false, 00:26:27.845 "compare_and_write": false, 00:26:27.845 "abort": false, 00:26:27.845 "nvme_admin": false, 00:26:27.845 "nvme_io": false 00:26:27.845 }, 00:26:27.845 "driver_specific": { 00:26:27.845 "compress": { 00:26:27.845 "name": "COMP_lvs0/lv0", 00:26:27.845 "base_bdev_name": "593c4a99-7291-4015-bf27-53be62992fee" 00:26:27.845 } 00:26:27.845 } 00:26:27.845 } 00:26:27.845 ] 00:26:28.104 13:54:42 
compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:28.104 13:54:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:28.104 Running I/O for 3 seconds... 00:26:31.404 00:26:31.404 Latency(us) 00:26:31.404 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:31.404 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:31.404 Verification LBA range: start 0x0 length 0x3100 00:26:31.404 COMP_lvs0/lv0 : 3.00 4255.27 16.62 0.00 0.00 7467.75 590.51 6280.53 00:26:31.405 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:31.405 Verification LBA range: start 0x3100 length 0x3100 00:26:31.405 COMP_lvs0/lv0 : 3.00 4261.27 16.65 0.00 0.00 7466.02 423.25 6225.92 00:26:31.405 =================================================================================================================== 00:26:31.405 Total : 8516.54 33.27 0.00 0.00 7466.88 423.25 6280.53 00:26:31.405 0 00:26:31.405 13:54:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:26:31.405 13:54:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:31.405 13:54:45 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:31.666 13:54:45 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:31.666 13:54:45 compress_isal -- compress/compress.sh@78 -- # killprocess 1708900 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1708900 ']' 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1708900 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@954 -- # uname 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@954 
-- # '[' Linux = Linux ']' 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1708900 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1708900' 00:26:31.666 killing process with pid 1708900 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@968 -- # kill 1708900 00:26:31.666 Received shutdown signal, test time was about 3.000000 seconds 00:26:31.666 00:26:31.666 Latency(us) 00:26:31.666 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:31.666 =================================================================================================================== 00:26:31.666 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:31.666 13:54:45 compress_isal -- common/autotest_common.sh@973 -- # wait 1708900 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1710724 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1710724 00:26:33.580 13:54:47 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:33.580 13:54:47 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1710724 ']' 00:26:33.580 13:54:47 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:33.580 13:54:47 compress_isal -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:26:33.580 13:54:47 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:33.580 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:33.580 13:54:47 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:33.580 13:54:47 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:33.580 [2024-06-10 13:54:47.949729] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:33.580 [2024-06-10 13:54:47.949783] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1710724 ] 00:26:33.580 [2024-06-10 13:54:48.020928] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:33.841 [2024-06-10 13:54:48.086045] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:33.841 [2024-06-10 13:54:48.086050] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:34.411 13:54:48 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:34.411 13:54:48 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:26:34.411 13:54:48 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:26:34.411 13:54:48 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:34.411 13:54:48 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:34.983 13:54:49 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:34.983 13:54:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:34.983 13:54:49 compress_isal -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:34.983 13:54:49 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:34.983 13:54:49 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:34.983 13:54:49 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:34.983 13:54:49 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:35.244 13:54:49 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:35.505 [ 00:26:35.505 { 00:26:35.505 "name": "Nvme0n1", 00:26:35.505 "aliases": [ 00:26:35.505 "36344730-5260-5504-0025-3845000000bb" 00:26:35.505 ], 00:26:35.505 "product_name": "NVMe disk", 00:26:35.505 "block_size": 512, 00:26:35.505 "num_blocks": 3750748848, 00:26:35.505 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:35.505 "assigned_rate_limits": { 00:26:35.505 "rw_ios_per_sec": 0, 00:26:35.505 "rw_mbytes_per_sec": 0, 00:26:35.505 "r_mbytes_per_sec": 0, 00:26:35.505 "w_mbytes_per_sec": 0 00:26:35.505 }, 00:26:35.505 "claimed": false, 00:26:35.505 "zoned": false, 00:26:35.505 "supported_io_types": { 00:26:35.505 "read": true, 00:26:35.505 "write": true, 00:26:35.505 "unmap": true, 00:26:35.505 "write_zeroes": true, 00:26:35.505 "flush": true, 00:26:35.505 "reset": true, 00:26:35.505 "compare": true, 00:26:35.505 "compare_and_write": false, 00:26:35.505 "abort": true, 00:26:35.505 "nvme_admin": true, 00:26:35.505 "nvme_io": true 00:26:35.505 }, 00:26:35.505 "driver_specific": { 00:26:35.505 "nvme": [ 00:26:35.505 { 00:26:35.505 "pci_address": "0000:65:00.0", 00:26:35.505 "trid": { 00:26:35.505 "trtype": "PCIe", 00:26:35.505 "traddr": "0000:65:00.0" 00:26:35.505 }, 00:26:35.505 "ctrlr_data": { 00:26:35.505 "cntlid": 6, 00:26:35.505 "vendor_id": "0x144d", 00:26:35.505 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 
00:26:35.505 "serial_number": "S64GNE0R605504", 00:26:35.505 "firmware_revision": "GDC5302Q", 00:26:35.505 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:35.505 "oacs": { 00:26:35.505 "security": 1, 00:26:35.505 "format": 1, 00:26:35.505 "firmware": 1, 00:26:35.505 "ns_manage": 1 00:26:35.505 }, 00:26:35.505 "multi_ctrlr": false, 00:26:35.505 "ana_reporting": false 00:26:35.505 }, 00:26:35.505 "vs": { 00:26:35.505 "nvme_version": "1.4" 00:26:35.505 }, 00:26:35.505 "ns_data": { 00:26:35.505 "id": 1, 00:26:35.505 "can_share": false 00:26:35.505 }, 00:26:35.505 "security": { 00:26:35.505 "opal": true 00:26:35.505 } 00:26:35.505 } 00:26:35.505 ], 00:26:35.505 "mp_policy": "active_passive" 00:26:35.505 } 00:26:35.505 } 00:26:35.505 ] 00:26:35.505 13:54:49 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:35.505 13:54:49 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:36.448 362c7246-9083-42a5-9b82-af06f8b67b4e 00:26:36.448 13:54:50 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:36.448 785edba0-3ce6-402e-9cd1-492acf266dda 00:26:36.448 13:54:50 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:36.448 13:54:50 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:26:36.709 13:54:51 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:36.969 [ 00:26:36.969 { 00:26:36.969 "name": "785edba0-3ce6-402e-9cd1-492acf266dda", 00:26:36.969 "aliases": [ 00:26:36.969 "lvs0/lv0" 00:26:36.969 ], 00:26:36.969 "product_name": "Logical Volume", 00:26:36.969 "block_size": 512, 00:26:36.969 "num_blocks": 204800, 00:26:36.969 "uuid": "785edba0-3ce6-402e-9cd1-492acf266dda", 00:26:36.969 "assigned_rate_limits": { 00:26:36.969 "rw_ios_per_sec": 0, 00:26:36.969 "rw_mbytes_per_sec": 0, 00:26:36.969 "r_mbytes_per_sec": 0, 00:26:36.969 "w_mbytes_per_sec": 0 00:26:36.969 }, 00:26:36.969 "claimed": false, 00:26:36.969 "zoned": false, 00:26:36.969 "supported_io_types": { 00:26:36.969 "read": true, 00:26:36.969 "write": true, 00:26:36.969 "unmap": true, 00:26:36.969 "write_zeroes": true, 00:26:36.969 "flush": false, 00:26:36.969 "reset": true, 00:26:36.969 "compare": false, 00:26:36.969 "compare_and_write": false, 00:26:36.969 "abort": false, 00:26:36.969 "nvme_admin": false, 00:26:36.969 "nvme_io": false 00:26:36.969 }, 00:26:36.969 "driver_specific": { 00:26:36.969 "lvol": { 00:26:36.969 "lvol_store_uuid": "362c7246-9083-42a5-9b82-af06f8b67b4e", 00:26:36.969 "base_bdev": "Nvme0n1", 00:26:36.969 "thin_provision": true, 00:26:36.969 "num_allocated_clusters": 0, 00:26:36.969 "snapshot": false, 00:26:36.969 "clone": false, 00:26:36.969 "esnap_clone": false 00:26:36.969 } 00:26:36.969 } 00:26:36.969 } 00:26:36.969 ] 00:26:36.969 13:54:51 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:36.969 13:54:51 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:26:36.969 13:54:51 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:26:36.969 [2024-06-10 13:54:51.437208] 
vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:36.969 COMP_lvs0/lv0 00:26:37.230 13:54:51 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:37.230 13:54:51 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:37.491 [ 00:26:37.491 { 00:26:37.491 "name": "COMP_lvs0/lv0", 00:26:37.491 "aliases": [ 00:26:37.491 "69d1a7ad-d2b0-549f-8a2d-0d05d1b71d87" 00:26:37.491 ], 00:26:37.491 "product_name": "compress", 00:26:37.491 "block_size": 4096, 00:26:37.491 "num_blocks": 25088, 00:26:37.491 "uuid": "69d1a7ad-d2b0-549f-8a2d-0d05d1b71d87", 00:26:37.491 "assigned_rate_limits": { 00:26:37.491 "rw_ios_per_sec": 0, 00:26:37.491 "rw_mbytes_per_sec": 0, 00:26:37.491 "r_mbytes_per_sec": 0, 00:26:37.491 "w_mbytes_per_sec": 0 00:26:37.491 }, 00:26:37.491 "claimed": false, 00:26:37.491 "zoned": false, 00:26:37.491 "supported_io_types": { 00:26:37.491 "read": true, 00:26:37.491 "write": true, 00:26:37.491 "unmap": false, 00:26:37.491 "write_zeroes": true, 00:26:37.491 "flush": false, 00:26:37.491 "reset": false, 00:26:37.491 "compare": false, 00:26:37.491 "compare_and_write": false, 00:26:37.491 "abort": false, 00:26:37.491 "nvme_admin": false, 00:26:37.491 "nvme_io": false 
00:26:37.491 }, 00:26:37.491 "driver_specific": { 00:26:37.491 "compress": { 00:26:37.491 "name": "COMP_lvs0/lv0", 00:26:37.491 "base_bdev_name": "785edba0-3ce6-402e-9cd1-492acf266dda" 00:26:37.491 } 00:26:37.491 } 00:26:37.491 } 00:26:37.491 ] 00:26:37.491 13:54:51 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:37.491 13:54:51 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:37.750 Running I/O for 3 seconds... 00:26:41.046 00:26:41.046 Latency(us) 00:26:41.046 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:41.046 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:41.047 Verification LBA range: start 0x0 length 0x3100 00:26:41.047 COMP_lvs0/lv0 : 3.00 4397.52 17.18 0.00 0.00 7225.87 621.23 6198.61 00:26:41.047 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:41.047 Verification LBA range: start 0x3100 length 0x3100 00:26:41.047 COMP_lvs0/lv0 : 3.00 4410.20 17.23 0.00 0.00 7215.08 556.37 6034.77 00:26:41.047 =================================================================================================================== 00:26:41.047 Total : 8807.72 34.41 0.00 0.00 7220.47 556.37 6198.61 00:26:41.047 0 00:26:41.047 13:54:55 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:26:41.047 13:54:55 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:41.047 13:54:55 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:41.047 13:54:55 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:41.047 13:54:55 compress_isal -- compress/compress.sh@78 -- # killprocess 1710724 00:26:41.047 13:54:55 compress_isal -- 
common/autotest_common.sh@949 -- # '[' -z 1710724 ']' 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1710724 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@954 -- # uname 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1710724 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1710724' 00:26:41.047 killing process with pid 1710724 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@968 -- # kill 1710724 00:26:41.047 Received shutdown signal, test time was about 3.000000 seconds 00:26:41.047 00:26:41.047 Latency(us) 00:26:41.047 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:41.047 =================================================================================================================== 00:26:41.047 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:41.047 13:54:55 compress_isal -- common/autotest_common.sh@973 -- # wait 1710724 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1712607 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1712607 00:26:43.587 13:54:57 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:26:43.587 13:54:57 
compress_isal -- common/autotest_common.sh@830 -- # '[' -z 1712607 ']' 00:26:43.587 13:54:57 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:43.587 13:54:57 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:43.587 13:54:57 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:43.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:43.587 13:54:57 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:43.587 13:54:57 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:43.587 [2024-06-10 13:54:57.520441] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:43.587 [2024-06-10 13:54:57.520494] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1712607 ] 00:26:43.587 [2024-06-10 13:54:57.611742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:43.587 [2024-06-10 13:54:57.698851] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:43.587 [2024-06-10 13:54:57.698979] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:43.587 [2024-06-10 13:54:57.698983] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:26:44.157 13:54:58 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:44.157 13:54:58 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:26:44.157 13:54:58 compress_isal -- compress/compress.sh@58 -- # create_vols 00:26:44.157 13:54:58 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:44.157 13:54:58 compress_isal -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:44.727 13:54:58 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:44.727 13:54:58 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:44.727 13:54:59 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:44.987 [ 00:26:44.987 { 00:26:44.987 "name": "Nvme0n1", 00:26:44.987 "aliases": [ 00:26:44.987 "36344730-5260-5504-0025-3845000000bb" 00:26:44.987 ], 00:26:44.987 "product_name": "NVMe disk", 00:26:44.987 "block_size": 512, 00:26:44.987 "num_blocks": 3750748848, 00:26:44.987 "uuid": "36344730-5260-5504-0025-3845000000bb", 00:26:44.987 "assigned_rate_limits": { 00:26:44.987 "rw_ios_per_sec": 0, 00:26:44.987 "rw_mbytes_per_sec": 0, 00:26:44.987 "r_mbytes_per_sec": 0, 00:26:44.987 "w_mbytes_per_sec": 0 00:26:44.987 }, 00:26:44.987 "claimed": false, 00:26:44.987 "zoned": false, 00:26:44.987 "supported_io_types": { 00:26:44.987 "read": true, 00:26:44.987 "write": true, 00:26:44.987 "unmap": true, 00:26:44.987 "write_zeroes": true, 00:26:44.987 "flush": true, 00:26:44.987 "reset": true, 00:26:44.987 "compare": true, 00:26:44.987 "compare_and_write": false, 00:26:44.987 "abort": true, 00:26:44.987 "nvme_admin": true, 00:26:44.987 "nvme_io": true 00:26:44.987 }, 00:26:44.987 "driver_specific": { 00:26:44.987 "nvme": [ 00:26:44.987 
{ 00:26:44.987 "pci_address": "0000:65:00.0", 00:26:44.987 "trid": { 00:26:44.987 "trtype": "PCIe", 00:26:44.987 "traddr": "0000:65:00.0" 00:26:44.987 }, 00:26:44.987 "ctrlr_data": { 00:26:44.987 "cntlid": 6, 00:26:44.987 "vendor_id": "0x144d", 00:26:44.987 "model_number": "SAMSUNG MZQL21T9HCJR-00A07", 00:26:44.987 "serial_number": "S64GNE0R605504", 00:26:44.987 "firmware_revision": "GDC5302Q", 00:26:44.987 "subnqn": "nqn.1994-11.com.samsung:nvme:PM9A3:2.5-inch:S64GNE0R605504 ", 00:26:44.987 "oacs": { 00:26:44.987 "security": 1, 00:26:44.987 "format": 1, 00:26:44.987 "firmware": 1, 00:26:44.987 "ns_manage": 1 00:26:44.987 }, 00:26:44.987 "multi_ctrlr": false, 00:26:44.987 "ana_reporting": false 00:26:44.987 }, 00:26:44.987 "vs": { 00:26:44.987 "nvme_version": "1.4" 00:26:44.987 }, 00:26:44.987 "ns_data": { 00:26:44.987 "id": 1, 00:26:44.987 "can_share": false 00:26:44.987 }, 00:26:44.987 "security": { 00:26:44.987 "opal": true 00:26:44.987 } 00:26:44.987 } 00:26:44.987 ], 00:26:44.987 "mp_policy": "active_passive" 00:26:44.987 } 00:26:44.987 } 00:26:44.987 ] 00:26:44.987 13:54:59 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:44.987 13:54:59 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:45.925 6184cee8-2e62-4693-9557-dbf62944e03d 00:26:45.925 13:55:00 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:46.183 1514c807-816a-4813-8a5e-9af04f769818 00:26:46.183 13:55:00 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:46.183 13:55:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:26:46.183 13:55:00 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:46.183 13:55:00 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:46.183 13:55:00 
compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:46.183 13:55:00 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:46.183 13:55:00 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:46.184 13:55:00 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:46.443 [ 00:26:46.443 { 00:26:46.443 "name": "1514c807-816a-4813-8a5e-9af04f769818", 00:26:46.443 "aliases": [ 00:26:46.443 "lvs0/lv0" 00:26:46.443 ], 00:26:46.443 "product_name": "Logical Volume", 00:26:46.443 "block_size": 512, 00:26:46.444 "num_blocks": 204800, 00:26:46.444 "uuid": "1514c807-816a-4813-8a5e-9af04f769818", 00:26:46.444 "assigned_rate_limits": { 00:26:46.444 "rw_ios_per_sec": 0, 00:26:46.444 "rw_mbytes_per_sec": 0, 00:26:46.444 "r_mbytes_per_sec": 0, 00:26:46.444 "w_mbytes_per_sec": 0 00:26:46.444 }, 00:26:46.444 "claimed": false, 00:26:46.444 "zoned": false, 00:26:46.444 "supported_io_types": { 00:26:46.444 "read": true, 00:26:46.444 "write": true, 00:26:46.444 "unmap": true, 00:26:46.444 "write_zeroes": true, 00:26:46.444 "flush": false, 00:26:46.444 "reset": true, 00:26:46.444 "compare": false, 00:26:46.444 "compare_and_write": false, 00:26:46.444 "abort": false, 00:26:46.444 "nvme_admin": false, 00:26:46.444 "nvme_io": false 00:26:46.444 }, 00:26:46.444 "driver_specific": { 00:26:46.444 "lvol": { 00:26:46.444 "lvol_store_uuid": "6184cee8-2e62-4693-9557-dbf62944e03d", 00:26:46.444 "base_bdev": "Nvme0n1", 00:26:46.444 "thin_provision": true, 00:26:46.444 "num_allocated_clusters": 0, 00:26:46.444 "snapshot": false, 00:26:46.444 "clone": false, 00:26:46.444 "esnap_clone": false 00:26:46.444 } 00:26:46.444 } 00:26:46.444 } 00:26:46.444 ] 00:26:46.444 13:55:00 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:46.444 13:55:00 compress_isal -- 
compress/compress.sh@41 -- # '[' -z '' ']' 00:26:46.444 13:55:00 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:46.704 [2024-06-10 13:55:01.054347] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:46.704 COMP_lvs0/lv0 00:26:46.704 13:55:01 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@900 -- # local i 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:46.704 13:55:01 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:46.964 13:55:01 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:47.227 [ 00:26:47.228 { 00:26:47.228 "name": "COMP_lvs0/lv0", 00:26:47.228 "aliases": [ 00:26:47.228 "3aab4669-d255-5787-a0e3-9952956c07c0" 00:26:47.228 ], 00:26:47.228 "product_name": "compress", 00:26:47.228 "block_size": 512, 00:26:47.228 "num_blocks": 200704, 00:26:47.228 "uuid": "3aab4669-d255-5787-a0e3-9952956c07c0", 00:26:47.228 "assigned_rate_limits": { 00:26:47.228 "rw_ios_per_sec": 0, 00:26:47.228 "rw_mbytes_per_sec": 0, 00:26:47.228 "r_mbytes_per_sec": 0, 00:26:47.228 "w_mbytes_per_sec": 0 00:26:47.228 }, 00:26:47.228 "claimed": false, 00:26:47.228 "zoned": false, 00:26:47.228 "supported_io_types": { 00:26:47.228 "read": true, 00:26:47.228 "write": true, 00:26:47.228 "unmap": false, 
00:26:47.228 "write_zeroes": true, 00:26:47.228 "flush": false, 00:26:47.228 "reset": false, 00:26:47.228 "compare": false, 00:26:47.228 "compare_and_write": false, 00:26:47.228 "abort": false, 00:26:47.228 "nvme_admin": false, 00:26:47.228 "nvme_io": false 00:26:47.228 }, 00:26:47.228 "driver_specific": { 00:26:47.228 "compress": { 00:26:47.228 "name": "COMP_lvs0/lv0", 00:26:47.228 "base_bdev_name": "1514c807-816a-4813-8a5e-9af04f769818" 00:26:47.228 } 00:26:47.228 } 00:26:47.228 } 00:26:47.228 ] 00:26:47.228 13:55:01 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:26:47.228 13:55:01 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:47.228 I/O targets: 00:26:47.228 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:26:47.228 00:26:47.228 00:26:47.228 CUnit - A unit testing framework for C - Version 2.1-3 00:26:47.228 http://cunit.sourceforge.net/ 00:26:47.228 00:26:47.228 00:26:47.228 Suite: bdevio tests on: COMP_lvs0/lv0 00:26:47.228 Test: blockdev write read block ...passed 00:26:47.228 Test: blockdev write zeroes read block ...passed 00:26:47.228 Test: blockdev write zeroes read no split ...passed 00:26:47.228 Test: blockdev write zeroes read split ...passed 00:26:47.228 Test: blockdev write zeroes read split partial ...passed 00:26:47.228 Test: blockdev reset ...[2024-06-10 13:55:01.658016] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:26:47.228 passed 00:26:47.228 Test: blockdev write read 8 blocks ...passed 00:26:47.228 Test: blockdev write read size > 128k ...passed 00:26:47.228 Test: blockdev write read invalid size ...passed 00:26:47.228 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:47.228 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:47.228 Test: blockdev write read max offset ...passed 00:26:47.228 Test: blockdev write read 2 blocks on overlapped 
address offset ...passed 00:26:47.228 Test: blockdev writev readv 8 blocks ...passed 00:26:47.228 Test: blockdev writev readv 30 x 1block ...passed 00:26:47.228 Test: blockdev writev readv block ...passed 00:26:47.228 Test: blockdev writev readv size > 128k ...passed 00:26:47.228 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:47.228 Test: blockdev comparev and writev ...passed 00:26:47.228 Test: blockdev nvme passthru rw ...passed 00:26:47.228 Test: blockdev nvme passthru vendor specific ...passed 00:26:47.228 Test: blockdev nvme admin passthru ...passed 00:26:47.228 Test: blockdev copy ...passed 00:26:47.228 00:26:47.228 Run Summary: Type Total Ran Passed Failed Inactive 00:26:47.228 suites 1 1 n/a 0 0 00:26:47.228 tests 23 23 23 0 0 00:26:47.228 asserts 130 130 130 0 n/a 00:26:47.228 00:26:47.228 Elapsed time = 0.101 seconds 00:26:47.228 0 00:26:47.228 13:55:01 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:26:47.228 13:55:01 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:47.488 13:55:01 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:47.757 13:55:02 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:26:47.757 13:55:02 compress_isal -- compress/compress.sh@62 -- # killprocess 1712607 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 1712607 ']' 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@953 -- # kill -0 1712607 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@954 -- # uname 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1712607 00:26:47.757 13:55:02 compress_isal -- 
common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1712607' 00:26:47.757 killing process with pid 1712607 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@968 -- # kill 1712607 00:26:47.757 13:55:02 compress_isal -- common/autotest_common.sh@973 -- # wait 1712607 00:26:49.733 13:55:04 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:26:49.733 13:55:04 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:26:49.733 00:26:49.733 real 0m35.529s 00:26:49.733 user 1m24.772s 00:26:49.733 sys 0m2.977s 00:26:49.733 13:55:04 compress_isal -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:49.733 13:55:04 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:49.733 ************************************ 00:26:49.733 END TEST compress_isal 00:26:49.733 ************************************ 00:26:49.733 13:55:04 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:26:49.733 13:55:04 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:26:49.733 13:55:04 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:26:49.733 13:55:04 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:49.733 13:55:04 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:49.733 13:55:04 -- common/autotest_common.sh@10 -- # set +x 00:26:49.733 ************************************ 00:26:49.733 START TEST blockdev_crypto_aesni 00:26:49.733 ************************************ 00:26:49.733 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:26:49.993 * Looking for test storage... 
00:26:49.993 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:26:49.993 13:55:04 
blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1713972 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1713972 00:26:49.993 13:55:04 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:49.993 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@830 -- # '[' -z 1713972 ']' 00:26:49.993 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:49.993 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:49.993 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:49.993 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:26:49.994 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:49.994 13:55:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:49.994 [2024-06-10 13:55:04.376600] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:49.994 [2024-06-10 13:55:04.376662] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713972 ] 00:26:50.254 [2024-06-10 13:55:04.469048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:50.254 [2024-06-10 13:55:04.536917] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:26:50.823 13:55:05 blockdev_crypto_aesni -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:50.823 13:55:05 blockdev_crypto_aesni -- common/autotest_common.sh@863 -- # return 0 00:26:50.823 13:55:05 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:26:50.823 13:55:05 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:26:50.823 13:55:05 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:26:50.823 13:55:05 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:50.823 13:55:05 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:50.823 [2024-06-10 13:55:05.194804] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:50.823 [2024-06-10 13:55:05.202839] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:50.823 [2024-06-10 13:55:05.210852] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:50.823 [2024-06-10 13:55:05.260167] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto 
devices: 97 00:26:53.366 true 00:26:53.366 true 00:26:53.366 true 00:26:53.366 true 00:26:53.366 Malloc0 00:26:53.366 Malloc1 00:26:53.366 Malloc2 00:26:53.366 Malloc3 00:26:53.366 [2024-06-10 13:55:07.527623] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:53.366 crypto_ram 00:26:53.366 [2024-06-10 13:55:07.535644] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:53.366 crypto_ram2 00:26:53.366 [2024-06-10 13:55:07.543664] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:53.366 crypto_ram3 00:26:53.366 [2024-06-10 13:55:07.551687] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:53.366 crypto_ram4 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:53.366 13:55:07 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:26:53.366 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:26:53.366 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3b692ba8-9015-5d5c-9c47-6170b0b13e59"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b692ba8-9015-5d5c-9c47-6170b0b13e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": 
"system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bfaa8b37-d3fa-5572-9e4f-3a665018ac1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bfaa8b37-d3fa-5572-9e4f-3a665018ac1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "5f3f0031-f9ea-54d5-ba29-ad2d1234c101"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5f3f0031-f9ea-54d5-ba29-ad2d1234c101",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "f98f1745-64b0-5af7-96b8-47d9b4c4629b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f98f1745-64b0-5af7-96b8-47d9b4c4629b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:26:53.367 13:55:07 blockdev_crypto_aesni -- bdev/blockdev.sh@754 
-- # killprocess 1713972 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@949 -- # '[' -z 1713972 ']' 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # kill -0 1713972 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # uname 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1713972 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1713972' 00:26:53.367 killing process with pid 1713972 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # kill 1713972 00:26:53.367 13:55:07 blockdev_crypto_aesni -- common/autotest_common.sh@973 -- # wait 1713972 00:26:53.628 13:55:08 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:53.628 13:55:08 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:53.628 13:55:08 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:26:53.628 13:55:08 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:53.628 13:55:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:53.628 ************************************ 00:26:53.628 START TEST bdev_hello_world 00:26:53.628 ************************************ 00:26:53.628 13:55:08 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:53.888 [2024-06-10 13:55:08.142814] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:53.888 [2024-06-10 13:55:08.142863] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1714792 ] 00:26:53.888 [2024-06-10 13:55:08.231991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:53.888 [2024-06-10 13:55:08.305269] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:26:53.888 [2024-06-10 13:55:08.326337] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:53.888 [2024-06-10 13:55:08.334365] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:53.888 [2024-06-10 13:55:08.342380] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:54.148 [2024-06-10 13:55:08.427241] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:56.689 [2024-06-10 13:55:10.572191] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:56.689 [2024-06-10 13:55:10.572249] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:56.689 [2024-06-10 13:55:10.572258] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.689 [2024-06-10 13:55:10.580208] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:56.689 [2024-06-10 13:55:10.580220] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc1 00:26:56.689 [2024-06-10 13:55:10.580226] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.689 [2024-06-10 13:55:10.588228] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:56.689 [2024-06-10 13:55:10.588239] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:56.689 [2024-06-10 13:55:10.588249] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.689 [2024-06-10 13:55:10.596248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:56.689 [2024-06-10 13:55:10.596259] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:56.689 [2024-06-10 13:55:10.596265] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.689 [2024-06-10 13:55:10.658769] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:56.689 [2024-06-10 13:55:10.658798] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:56.689 [2024-06-10 13:55:10.658810] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:26:56.689 [2024-06-10 13:55:10.659912] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:56.689 [2024-06-10 13:55:10.659973] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:56.689 [2024-06-10 13:55:10.659983] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:56.689 [2024-06-10 13:55:10.660017] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:26:56.689 00:26:56.689 [2024-06-10 13:55:10.660028] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:56.689 00:26:56.689 real 0m2.787s 00:26:56.689 user 0m2.542s 00:26:56.689 sys 0m0.215s 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:56.689 ************************************ 00:26:56.689 END TEST bdev_hello_world 00:26:56.689 ************************************ 00:26:56.689 13:55:10 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:26:56.689 13:55:10 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:26:56.689 13:55:10 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:56.689 13:55:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:56.689 ************************************ 00:26:56.689 START TEST bdev_bounds 00:26:56.689 ************************************ 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1715228 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1715228' 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:56.689 Process bdevio pid: 1715228 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1715228 00:26:56.689 13:55:10 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1715228 ']' 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:56.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:56.689 13:55:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:56.689 [2024-06-10 13:55:11.007890] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:26:56.689 [2024-06-10 13:55:11.007939] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1715228 ] 00:26:56.689 [2024-06-10 13:55:11.097713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:56.951 [2024-06-10 13:55:11.165462] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:26:56.951 [2024-06-10 13:55:11.165632] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:26:56.951 [2024-06-10 13:55:11.165636] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:26:56.951 [2024-06-10 13:55:11.186752] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:56.951 [2024-06-10 13:55:11.194783] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:56.951 [2024-06-10 13:55:11.202798] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:56.951 [2024-06-10 13:55:11.288961] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:59.497 [2024-06-10 13:55:13.437907] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:59.497 [2024-06-10 13:55:13.437966] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:59.497 [2024-06-10 13:55:13.437975] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:59.497 [2024-06-10 13:55:13.445925] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:59.497 [2024-06-10 13:55:13.445938] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:59.497 [2024-06-10 13:55:13.445944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:59.497 [2024-06-10 13:55:13.453945] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:59.497 [2024-06-10 13:55:13.453956] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:59.497 [2024-06-10 13:55:13.453962] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:59.497 [2024-06-10 13:55:13.461967] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:59.497 [2024-06-10 13:55:13.461978] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:59.497 [2024-06-10 13:55:13.461984] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:59.497 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:59.497 13:55:13 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:26:59.497 13:55:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:59.497 I/O targets: 00:26:59.497 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:26:59.497 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:26:59.497 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:26:59.497 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:26:59.497 00:26:59.497 00:26:59.497 CUnit - A unit testing framework for C - Version 2.1-3 00:26:59.497 http://cunit.sourceforge.net/ 00:26:59.497 00:26:59.497 00:26:59.497 Suite: bdevio tests on: crypto_ram4 00:26:59.497 Test: blockdev write read block ...passed 00:26:59.497 Test: blockdev write zeroes read block ...passed 00:26:59.497 Test: blockdev write zeroes read no split ...passed 00:26:59.497 Test: blockdev write zeroes read split ...passed 00:26:59.497 Test: blockdev write zeroes read split partial ...passed 00:26:59.497 Test: blockdev reset ...passed 00:26:59.497 Test: blockdev write read 8 blocks ...passed 00:26:59.497 Test: blockdev write read size > 128k ...passed 00:26:59.497 Test: blockdev write read invalid size ...passed 00:26:59.497 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:59.497 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:59.497 Test: blockdev write read max offset ...passed 00:26:59.497 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:59.497 Test: blockdev writev readv 8 blocks ...passed 00:26:59.497 Test: blockdev writev readv 30 x 1block ...passed 00:26:59.497 Test: blockdev writev readv block ...passed 00:26:59.497 Test: blockdev writev readv size > 128k ...passed 00:26:59.497 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:59.497 Test: blockdev comparev and writev ...passed 00:26:59.497 Test: blockdev nvme 
passthru rw ...passed 00:26:59.498 Test: blockdev nvme passthru vendor specific ...passed 00:26:59.498 Test: blockdev nvme admin passthru ...passed 00:26:59.498 Test: blockdev copy ...passed 00:26:59.498 Suite: bdevio tests on: crypto_ram3 00:26:59.498 Test: blockdev write read block ...passed 00:26:59.498 Test: blockdev write zeroes read block ...passed 00:26:59.498 Test: blockdev write zeroes read no split ...passed 00:26:59.498 Test: blockdev write zeroes read split ...passed 00:26:59.498 Test: blockdev write zeroes read split partial ...passed 00:26:59.498 Test: blockdev reset ...passed 00:26:59.498 Test: blockdev write read 8 blocks ...passed 00:26:59.498 Test: blockdev write read size > 128k ...passed 00:26:59.498 Test: blockdev write read invalid size ...passed 00:26:59.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:59.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:59.498 Test: blockdev write read max offset ...passed 00:26:59.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:59.498 Test: blockdev writev readv 8 blocks ...passed 00:26:59.498 Test: blockdev writev readv 30 x 1block ...passed 00:26:59.498 Test: blockdev writev readv block ...passed 00:26:59.498 Test: blockdev writev readv size > 128k ...passed 00:26:59.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:59.498 Test: blockdev comparev and writev ...passed 00:26:59.498 Test: blockdev nvme passthru rw ...passed 00:26:59.498 Test: blockdev nvme passthru vendor specific ...passed 00:26:59.498 Test: blockdev nvme admin passthru ...passed 00:26:59.498 Test: blockdev copy ...passed 00:26:59.498 Suite: bdevio tests on: crypto_ram2 00:26:59.498 Test: blockdev write read block ...passed 00:26:59.498 Test: blockdev write zeroes read block ...passed 00:26:59.498 Test: blockdev write zeroes read no split ...passed 00:26:59.498 Test: blockdev write zeroes read split ...passed 
00:26:59.498 Test: blockdev write zeroes read split partial ...passed 00:26:59.498 Test: blockdev reset ...passed 00:26:59.498 Test: blockdev write read 8 blocks ...passed 00:26:59.498 Test: blockdev write read size > 128k ...passed 00:26:59.498 Test: blockdev write read invalid size ...passed 00:26:59.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:59.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:59.498 Test: blockdev write read max offset ...passed 00:26:59.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:59.498 Test: blockdev writev readv 8 blocks ...passed 00:26:59.498 Test: blockdev writev readv 30 x 1block ...passed 00:26:59.498 Test: blockdev writev readv block ...passed 00:26:59.498 Test: blockdev writev readv size > 128k ...passed 00:26:59.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:59.498 Test: blockdev comparev and writev ...passed 00:26:59.498 Test: blockdev nvme passthru rw ...passed 00:26:59.498 Test: blockdev nvme passthru vendor specific ...passed 00:26:59.498 Test: blockdev nvme admin passthru ...passed 00:26:59.498 Test: blockdev copy ...passed 00:26:59.498 Suite: bdevio tests on: crypto_ram 00:26:59.498 Test: blockdev write read block ...passed 00:26:59.498 Test: blockdev write zeroes read block ...passed 00:26:59.498 Test: blockdev write zeroes read no split ...passed 00:26:59.498 Test: blockdev write zeroes read split ...passed 00:26:59.498 Test: blockdev write zeroes read split partial ...passed 00:26:59.498 Test: blockdev reset ...passed 00:26:59.498 Test: blockdev write read 8 blocks ...passed 00:26:59.498 Test: blockdev write read size > 128k ...passed 00:26:59.498 Test: blockdev write read invalid size ...passed 00:26:59.498 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:59.498 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:59.498 Test: blockdev 
write read max offset ...passed 00:26:59.498 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:59.498 Test: blockdev writev readv 8 blocks ...passed 00:26:59.498 Test: blockdev writev readv 30 x 1block ...passed 00:26:59.498 Test: blockdev writev readv block ...passed 00:26:59.498 Test: blockdev writev readv size > 128k ...passed 00:26:59.498 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:59.498 Test: blockdev comparev and writev ...passed 00:26:59.498 Test: blockdev nvme passthru rw ...passed 00:26:59.498 Test: blockdev nvme passthru vendor specific ...passed 00:26:59.498 Test: blockdev nvme admin passthru ...passed 00:26:59.498 Test: blockdev copy ...passed 00:26:59.498 00:26:59.498 Run Summary: Type Total Ran Passed Failed Inactive 00:26:59.498 suites 4 4 n/a 0 0 00:26:59.498 tests 92 92 92 0 0 00:26:59.498 asserts 520 520 520 0 n/a 00:26:59.498 00:26:59.498 Elapsed time = 0.486 seconds 00:26:59.498 0 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1715228 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1715228 ']' 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1715228 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:59.498 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1715228 00:26:59.758 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:59.758 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:59.758 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with 
pid 1715228' 00:26:59.758 killing process with pid 1715228 00:26:59.758 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1715228 00:26:59.758 13:55:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1715228 00:26:59.758 13:55:14 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:26:59.758 00:26:59.758 real 0m3.253s 00:26:59.758 user 0m9.346s 00:26:59.758 sys 0m0.362s 00:26:59.758 13:55:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:59.758 13:55:14 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:59.758 ************************************ 00:26:59.758 END TEST bdev_bounds 00:26:59.758 ************************************ 00:27:00.019 13:55:14 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:00.019 13:55:14 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:00.019 13:55:14 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:00.019 13:55:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:00.019 ************************************ 00:27:00.019 START TEST bdev_nbd 00:27:00.019 ************************************ 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1715856 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1715856 /var/tmp/spdk-nbd.sock 00:27:00.019 13:55:14 
blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1715856 ']' 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:00.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:00.019 13:55:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:00.019 [2024-06-10 13:55:14.340122] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:27:00.019 [2024-06-10 13:55:14.340183] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:00.019 [2024-06-10 13:55:14.431350] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:00.280 [2024-06-10 13:55:14.498390] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:00.281 [2024-06-10 13:55:14.519453] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:00.281 [2024-06-10 13:55:14.527479] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:00.281 [2024-06-10 13:55:14.535493] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:00.281 [2024-06-10 13:55:14.621343] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:02.824 [2024-06-10 13:55:16.767405] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:02.824 [2024-06-10 13:55:16.767459] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:02.824 [2024-06-10 13:55:16.767468] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.824 [2024-06-10 13:55:16.775425] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:02.824 [2024-06-10 13:55:16.775441] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:02.824 [2024-06-10 13:55:16.775447] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.824 [2024-06-10 13:55:16.783444] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:02.824 
[2024-06-10 13:55:16.783456] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:02.824 [2024-06-10 13:55:16.783462] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.824 [2024-06-10 13:55:16.791465] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:02.824 [2024-06-10 13:55:16.791476] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:02.824 [2024-06-10 13:55:16.791481] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:02.824 
13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:02.824 13:55:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:02.824 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:02.825 1+0 records in 00:27:02.825 1+0 records out 00:27:02.825 4096 
bytes (4.1 kB, 4.0 KiB) copied, 0.000144529 s, 28.3 MB/s 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:02.825 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:27:03.086 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:03.087 1+0 records in 00:27:03.087 1+0 records out 00:27:03.087 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318857 s, 12.8 MB/s 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:03.087 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:03.349 1+0 records in 00:27:03.349 1+0 records out 00:27:03.349 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254476 s, 16.1 MB/s 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:03.349 13:55:17 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:03.349 1+0 records in 00:27:03.349 1+0 records out 00:27:03.349 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292039 s, 14.0 MB/s 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:03.349 13:55:17 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:03.611 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:03.611 13:55:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:03.611 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:03.611 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:27:03.611 13:55:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd0", 00:27:03.611 "bdev_name": "crypto_ram" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd1", 00:27:03.611 "bdev_name": "crypto_ram2" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd2", 00:27:03.611 "bdev_name": "crypto_ram3" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd3", 00:27:03.611 "bdev_name": "crypto_ram4" 00:27:03.611 } 00:27:03.611 ]' 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd0", 00:27:03.611 "bdev_name": "crypto_ram" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd1", 00:27:03.611 "bdev_name": "crypto_ram2" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd2", 00:27:03.611 "bdev_name": "crypto_ram3" 00:27:03.611 }, 00:27:03.611 { 00:27:03.611 "nbd_device": "/dev/nbd3", 00:27:03.611 "bdev_name": "crypto_ram4" 00:27:03.611 } 00:27:03.611 ]' 00:27:03.611 13:55:18 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:03.611 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:03.872 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:04.132 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:04.393 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:04.653 13:55:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:04.653 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:04.653 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:04.653 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:04.914 13:55:19 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 
'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:04.914 /dev/nbd0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:04.914 1+0 records in 00:27:04.914 1+0 records out 00:27:04.914 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279647 s, 14.6 MB/s 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:04.914 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:27:05.174 /dev/nbd1 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:05.174 13:55:19 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:05.174 1+0 records in 00:27:05.174 1+0 records out 00:27:05.174 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260805 s, 15.7 MB/s 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:05.174 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.175 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:05.175 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:05.175 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:05.175 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:05.175 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:27:05.436 /dev/nbd10 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd10 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:05.436 1+0 records in 00:27:05.436 1+0 records out 00:27:05.436 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302929 s, 13.5 MB/s 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:05.436 13:55:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:27:05.697 /dev/nbd11 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:05.697 1+0 records in 00:27:05.697 1+0 records out 00:27:05.697 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275149 s, 14.9 MB/s 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:05.697 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd0", 00:27:05.957 "bdev_name": "crypto_ram" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd1", 00:27:05.957 "bdev_name": "crypto_ram2" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd10", 00:27:05.957 "bdev_name": "crypto_ram3" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd11", 00:27:05.957 "bdev_name": "crypto_ram4" 00:27:05.957 } 00:27:05.957 ]' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd0", 00:27:05.957 "bdev_name": "crypto_ram" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd1", 00:27:05.957 "bdev_name": "crypto_ram2" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd10", 00:27:05.957 "bdev_name": "crypto_ram3" 00:27:05.957 }, 00:27:05.957 { 00:27:05.957 "nbd_device": "/dev/nbd11", 00:27:05.957 
"bdev_name": "crypto_ram4" 00:27:05.957 } 00:27:05.957 ]' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:05.957 /dev/nbd1 00:27:05.957 /dev/nbd10 00:27:05.957 /dev/nbd11' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:05.957 /dev/nbd1 00:27:05.957 /dev/nbd10 00:27:05.957 /dev/nbd11' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:05.957 256+0 records in 00:27:05.957 
256+0 records out 00:27:05.957 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113356 s, 92.5 MB/s 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:05.957 256+0 records in 00:27:05.957 256+0 records out 00:27:05.957 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0521171 s, 20.1 MB/s 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:05.957 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:06.218 256+0 records in 00:27:06.218 256+0 records out 00:27:06.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0457098 s, 22.9 MB/s 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:27:06.218 256+0 records in 00:27:06.218 256+0 records out 00:27:06.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0378936 s, 27.7 MB/s 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:27:06.218 256+0 records in 00:27:06.218 256+0 records out 00:27:06.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0447812 s, 23.4 MB/s 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 
-- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.218 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.479 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.741 13:55:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.741 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:07.000 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:07.260 13:55:21 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:07.260 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:07.520 malloc_lvol_verify 00:27:07.520 13:55:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:07.780 baa6b5fb-c940-494d-ae11-7e22cf13db23 00:27:07.780 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:27:08.040 2ff3efc4-d452-4341-8411-30e84bb04c12 00:27:08.040 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:08.301 /dev/nbd0 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:27:08.301 mke2fs 1.46.5 (30-Dec-2021) 00:27:08.301 Discarding device blocks: 0/4096 done 00:27:08.301 Creating filesystem with 4096 1k blocks and 1024 inodes 00:27:08.301 00:27:08.301 Allocating group tables: 0/1 done 00:27:08.301 Writing inode tables: 0/1 done 00:27:08.301 Creating journal (1024 blocks): done 00:27:08.301 Writing superblocks and filesystem accounting information: 0/1 done 00:27:08.301 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:08.301 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:08.561 13:55:22 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1715856 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1715856 ']' 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1715856 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1715856 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1715856' 00:27:08.561 killing process with pid 1715856 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@968 -- # kill 1715856 00:27:08.561 13:55:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1715856 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:27:08.832 00:27:08.832 real 0m8.922s 00:27:08.832 user 0m12.522s 00:27:08.832 sys 0m2.438s 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:08.832 ************************************ 00:27:08.832 END TEST bdev_nbd 00:27:08.832 ************************************ 00:27:08.832 13:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:27:08.832 13:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:27:08.832 13:55:23 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:08.832 13:55:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:08.832 ************************************ 00:27:08.832 START TEST bdev_fio 00:27:08.832 ************************************ 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:08.832 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:08.832 
13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 
00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:27:08.832 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:09.095 ************************************ 00:27:09.095 START TEST bdev_fio_rw_verify 00:27:09.095 ************************************ 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1344 -- # awk '{print $3}' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:09.095 13:55:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:09.354 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:09.354 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:09.354 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:09.354 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:09.354 fio-3.35 00:27:09.354 Starting 4 threads 00:27:24.249 00:27:24.249 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1718545: Mon Jun 10 13:55:36 2024 00:27:24.249 read: IOPS=30.3k, BW=118MiB/s 
(124MB/s)(1183MiB/10001msec) 00:27:24.249 slat (usec): min=15, max=607, avg=42.67, stdev=27.90 00:27:24.249 clat (usec): min=8, max=1446, avg=229.68, stdev=159.35 00:27:24.249 lat (usec): min=28, max=1558, avg=272.35, stdev=175.43 00:27:24.249 clat percentiles (usec): 00:27:24.249 | 50.000th=[ 186], 99.000th=[ 725], 99.900th=[ 979], 99.990th=[ 1221], 00:27:24.249 | 99.999th=[ 1401] 00:27:24.249 write: IOPS=33.3k, BW=130MiB/s (136MB/s)(1266MiB/9743msec); 0 zone resets 00:27:24.249 slat (usec): min=15, max=622, avg=53.71, stdev=27.75 00:27:24.249 clat (usec): min=26, max=2537, avg=303.39, stdev=200.93 00:27:24.249 lat (usec): min=55, max=2741, avg=357.10, stdev=217.53 00:27:24.249 clat percentiles (usec): 00:27:24.249 | 50.000th=[ 262], 99.000th=[ 898], 99.900th=[ 1188], 99.990th=[ 1647], 00:27:24.249 | 99.999th=[ 2311] 00:27:24.249 bw ( KiB/s): min=90392, max=145344, per=97.36%, avg=129544.37, stdev=3278.98, samples=76 00:27:24.249 iops : min=22598, max=36336, avg=32386.05, stdev=819.74, samples=76 00:27:24.249 lat (usec) : 10=0.01%, 20=0.01%, 50=1.86%, 100=13.48%, 250=42.61% 00:27:24.249 lat (usec) : 500=30.00%, 750=9.48%, 1000=2.32% 00:27:24.249 lat (msec) : 2=0.25%, 4=0.01% 00:27:24.249 cpu : usr=99.70%, sys=0.00%, ctx=54, majf=0, minf=269 00:27:24.249 IO depths : 1=10.4%, 2=23.6%, 4=52.8%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:24.249 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:24.249 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:24.249 issued rwts: total=302935,324081,0,0 short=0,0,0,0 dropped=0,0,0,0 00:27:24.249 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:24.249 00:27:24.249 Run status group 0 (all jobs): 00:27:24.249 READ: bw=118MiB/s (124MB/s), 118MiB/s-118MiB/s (124MB/s-124MB/s), io=1183MiB (1241MB), run=10001-10001msec 00:27:24.249 WRITE: bw=130MiB/s (136MB/s), 130MiB/s-130MiB/s (136MB/s-136MB/s), io=1266MiB (1327MB), run=9743-9743msec 00:27:24.249 00:27:24.249 
real 0m13.361s 00:27:24.249 user 0m52.943s 00:27:24.249 sys 0m0.465s 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:24.249 ************************************ 00:27:24.249 END TEST bdev_fio_rw_verify 00:27:24.249 ************************************ 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:27:24.249 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3b692ba8-9015-5d5c-9c47-6170b0b13e59"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b692ba8-9015-5d5c-9c47-6170b0b13e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bfaa8b37-d3fa-5572-9e4f-3a665018ac1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"bfaa8b37-d3fa-5572-9e4f-3a665018ac1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "5f3f0031-f9ea-54d5-ba29-ad2d1234c101"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5f3f0031-f9ea-54d5-ba29-ad2d1234c101",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' 
"f98f1745-64b0-5af7-96b8-47d9b4c4629b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f98f1745-64b0-5af7-96b8-47d9b4c4629b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:27:24.250 crypto_ram2 00:27:24.250 crypto_ram3 00:27:24.250 crypto_ram4 ]] 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3b692ba8-9015-5d5c-9c47-6170b0b13e59"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3b692ba8-9015-5d5c-9c47-6170b0b13e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bfaa8b37-d3fa-5572-9e4f-3a665018ac1c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bfaa8b37-d3fa-5572-9e4f-3a665018ac1c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "5f3f0031-f9ea-54d5-ba29-ad2d1234c101"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "5f3f0031-f9ea-54d5-ba29-ad2d1234c101",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": 
true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "f98f1745-64b0-5af7-96b8-47d9b4c4629b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f98f1745-64b0-5af7-96b8-47d9b4c4629b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:27:24.250 13:55:36 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:27:24.250 ************************************ 00:27:24.250 START TEST bdev_fio_trim 00:27:24.250 ************************************ 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:24.250 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:27:24.251 13:55:36 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:27:24.251 13:55:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:27:24.251 13:55:37 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:27:24.251 13:55:37 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:27:24.251 13:55:37 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:24.251 13:55:37 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:24.251 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:24.251 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:24.251 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:24.251 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:24.251 fio-3.35 00:27:24.251 Starting 4 threads 00:27:36.480 00:27:36.480 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1721327: Mon Jun 10 13:55:50 2024 00:27:36.480 write: IOPS=55.3k, BW=216MiB/s (226MB/s)(2160MiB/10001msec); 0 zone resets 00:27:36.480 slat (usec): min=15, max=903, avg=43.96, stdev=27.69 00:27:36.480 clat (usec): min=44, max=1586, avg=206.05, stdev=140.05 00:27:36.480 lat (usec): min=60, max=1872, avg=250.01, stdev=158.79 00:27:36.480 clat percentiles (usec): 00:27:36.480 | 50.000th=[ 159], 99.000th=[ 586], 99.900th=[ 742], 99.990th=[ 1123], 00:27:36.480 | 99.999th=[ 1532] 00:27:36.480 bw ( KiB/s): min=204552, max=225456, per=99.98%, avg=221098.95, stdev=1367.55, samples=76 00:27:36.480 iops : min=51138, max=56364, avg=55274.84, stdev=341.86, samples=76 00:27:36.480 trim: IOPS=55.3k, BW=216MiB/s (226MB/s)(2160MiB/10001msec); 0 zone resets 00:27:36.480 slat (usec): min=5, max=489, avg= 8.78, stdev= 4.53 00:27:36.480 clat (usec): min=60, max=1205, avg=177.85, stdev=75.52 00:27:36.480 lat (usec): min=66, max=1213, avg=186.63, stdev=76.27 00:27:36.480 clat percentiles (usec): 00:27:36.480 | 50.000th=[ 165], 99.000th=[ 396], 99.900th=[ 510], 99.990th=[ 
725], 00:27:36.480 | 99.999th=[ 1037] 00:27:36.480 bw ( KiB/s): min=204552, max=225456, per=99.98%, avg=221101.05, stdev=1366.61, samples=76 00:27:36.480 iops : min=51138, max=56364, avg=55275.26, stdev=341.65, samples=76 00:27:36.480 lat (usec) : 50=1.99%, 100=17.14%, 250=58.39%, 500=20.44%, 750=1.99% 00:27:36.480 lat (usec) : 1000=0.04% 00:27:36.480 lat (msec) : 2=0.01% 00:27:36.480 cpu : usr=99.72%, sys=0.00%, ctx=50, majf=0, minf=87 00:27:36.480 IO depths : 1=8.2%, 2=22.1%, 4=55.7%, 8=13.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:36.480 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:36.480 complete : 0=0.0%, 4=87.8%, 8=12.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:36.480 issued rwts: total=0,552913,552913,0 short=0,0,0,0 dropped=0,0,0,0 00:27:36.480 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:36.480 00:27:36.480 Run status group 0 (all jobs): 00:27:36.480 WRITE: bw=216MiB/s (226MB/s), 216MiB/s-216MiB/s (226MB/s-226MB/s), io=2160MiB (2265MB), run=10001-10001msec 00:27:36.480 TRIM: bw=216MiB/s (226MB/s), 216MiB/s-216MiB/s (226MB/s-226MB/s), io=2160MiB (2265MB), run=10001-10001msec 00:27:36.480 00:27:36.480 real 0m13.406s 00:27:36.480 user 0m56.190s 00:27:36.480 sys 0m0.493s 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:36.480 ************************************ 00:27:36.480 END TEST bdev_fio_trim 00:27:36.480 ************************************ 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:27:36.480 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:27:36.480 00:27:36.480 real 0m27.111s 00:27:36.480 user 1m49.316s 00:27:36.480 sys 0m1.133s 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:36.480 ************************************ 00:27:36.480 END TEST bdev_fio 00:27:36.480 ************************************ 00:27:36.480 13:55:50 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:36.480 13:55:50 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:36.480 13:55:50 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:27:36.480 13:55:50 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:36.480 13:55:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:36.480 ************************************ 00:27:36.480 START TEST bdev_verify 00:27:36.480 ************************************ 00:27:36.480 13:55:50 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:36.481 [2024-06-10 13:55:50.524660] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
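
Earlier in this trace (the `bdev_fio_trim` section), the harness probes the fio plugin with `ldd`/`grep`/`awk` for each sanitizer runtime before building `LD_PRELOAD` — here both probes came back empty, so only the plugin itself was preloaded. A minimal sketch of that detection pattern (paths are illustrative, and `/bin/ls` below stands in for the real `spdk_bdev` plugin, which is not available outside the CI node):

```shell
# Sketch of the sanitizer-detection loop seen in the trace above: if the fio
# plugin is linked against an ASAN runtime, that runtime must come first in
# LD_PRELOAD or fio aborts at startup. Paths here are illustrative.
detect_asan_lib() {
    local plugin=$1 sanitizer lib
    for sanitizer in libasan libclang_rt.asan; do
        # third ldd column is the resolved library path
        lib=$(ldd "$plugin" 2>/dev/null | grep "$sanitizer" | awk '{print $3}')
        [ -n "$lib" ] && { echo "$lib"; return; }
    done
    echo ""   # empty -> LD_PRELOAD carries only the plugin itself
}

asan_lib=$(detect_asan_lib /bin/ls)   # /bin/ls stands in for spdk_bdev here
echo "LD_PRELOAD='$asan_lib /path/to/spdk_bdev'"
```

In the run above both `grep` probes matched nothing, which is why the trace shows `asan_lib=` and an `LD_PRELOAD` containing only the plugin path.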
00:27:36.481 [2024-06-10 13:55:50.524720] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1723330 ] 00:27:36.481 [2024-06-10 13:55:50.615976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:36.481 [2024-06-10 13:55:50.713280] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:27:36.481 [2024-06-10 13:55:50.713482] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:36.481 [2024-06-10 13:55:50.734706] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:36.481 [2024-06-10 13:55:50.742736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:36.481 [2024-06-10 13:55:50.750762] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:36.481 [2024-06-10 13:55:50.839408] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:39.024 [2024-06-10 13:55:52.984092] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:39.024 [2024-06-10 13:55:52.984154] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:39.024 [2024-06-10 13:55:52.984167] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:39.025 [2024-06-10 13:55:52.992108] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:39.025 [2024-06-10 13:55:52.992121] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:39.025 [2024-06-10 13:55:52.992131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:39.025 [2024-06-10 
13:55:53.000130] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:39.025 [2024-06-10 13:55:53.000143] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:39.025 [2024-06-10 13:55:53.000150] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:39.025 [2024-06-10 13:55:53.008149] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:39.025 [2024-06-10 13:55:53.008165] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:39.025 [2024-06-10 13:55:53.008171] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:39.025 Running I/O for 5 seconds... 00:27:44.359 00:27:44.359 Latency(us) 00:27:44.359 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.359 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x0 length 0x1000 00:27:44.359 crypto_ram : 5.06 575.10 2.25 0.00 0.00 221677.38 4587.52 153791.15 00:27:44.359 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x1000 length 0x1000 00:27:44.359 crypto_ram : 5.06 581.62 2.27 0.00 0.00 219646.34 4751.36 152917.33 00:27:44.359 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x0 length 0x1000 00:27:44.359 crypto_ram2 : 5.06 578.07 2.26 0.00 0.00 220223.68 4614.83 140683.95 00:27:44.359 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x1000 length 0x1000 00:27:44.359 crypto_ram2 : 5.07 581.13 2.27 0.00 0.00 219140.72 4833.28 139810.13 00:27:44.359 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification 
LBA range: start 0x0 length 0x1000 00:27:44.359 crypto_ram3 : 5.05 4486.95 17.53 0.00 0.00 28279.03 3986.77 22937.60 00:27:44.359 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x1000 length 0x1000 00:27:44.359 crypto_ram3 : 5.04 4517.53 17.65 0.00 0.00 28080.85 6007.47 22719.15 00:27:44.359 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x0 length 0x1000 00:27:44.359 crypto_ram4 : 5.05 4489.01 17.54 0.00 0.00 28214.00 4123.31 23156.05 00:27:44.359 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:44.359 Verification LBA range: start 0x1000 length 0x1000 00:27:44.359 crypto_ram4 : 5.05 4531.82 17.70 0.00 0.00 27948.53 3304.11 22173.01 00:27:44.359 =================================================================================================================== 00:27:44.359 Total : 20341.22 79.46 0.00 0.00 50052.17 3304.11 153791.15 00:27:44.359 00:27:44.359 real 0m7.937s 00:27:44.359 user 0m15.239s 00:27:44.359 sys 0m0.254s 00:27:44.359 13:55:58 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:44.359 13:55:58 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:27:44.359 ************************************ 00:27:44.359 END TEST bdev_verify 00:27:44.359 ************************************ 00:27:44.359 13:55:58 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:44.359 13:55:58 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:27:44.359 13:55:58 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:44.359 13:55:58 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:27:44.359 ************************************ 00:27:44.359 START TEST bdev_verify_big_io 00:27:44.360 ************************************ 00:27:44.360 13:55:58 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:44.360 [2024-06-10 13:55:58.537298] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:27:44.360 [2024-06-10 13:55:58.537343] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1724960 ] 00:27:44.360 [2024-06-10 13:55:58.624958] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:44.360 [2024-06-10 13:55:58.691555] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:27:44.360 [2024-06-10 13:55:58.691560] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:44.360 [2024-06-10 13:55:58.712678] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:44.360 [2024-06-10 13:55:58.720707] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:44.360 [2024-06-10 13:55:58.728724] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:44.360 [2024-06-10 13:55:58.813280] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:46.918 [2024-06-10 13:56:00.965082] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:46.918 [2024-06-10 13:56:00.965142] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc0 00:27:46.918 [2024-06-10 13:56:00.965151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:46.918 [2024-06-10 13:56:00.973094] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:46.918 [2024-06-10 13:56:00.973106] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:46.918 [2024-06-10 13:56:00.973113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:46.918 [2024-06-10 13:56:00.981114] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:46.918 [2024-06-10 13:56:00.981125] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:46.918 [2024-06-10 13:56:00.981131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:46.918 [2024-06-10 13:56:00.989136] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:46.918 [2024-06-10 13:56:00.989148] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:46.918 [2024-06-10 13:56:00.989153] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:46.918 Running I/O for 5 seconds... 00:27:47.492 [2024-06-10 13:56:01.855363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.855762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.855880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.492 [2024-06-10 13:56:01.855919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.855951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.856192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.856202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.492 [2024-06-10 13:56:01.857880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.858190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.858201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.859630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.859673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.859705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.859736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.860572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.861449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.861969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.862328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.862339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.864776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.864842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.865251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.865261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.866690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.867025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.867035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.867891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.867930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.867961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.867993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.868926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.870173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.870797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.871094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.871104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.872072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.872115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.872147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.493 [2024-06-10 13:56:01.872188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.493 [2024-06-10 13:56:01.872627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:47.493 [... same error repeated continuously ...]
00:27:47.760 [2024-06-10 13:56:01.973261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:47.760 [2024-06-10 13:56:01.973871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.974200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.975697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.977020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.979789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.981350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.982733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.984076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.984730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.985307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.986626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.988043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.990376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:01.991716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.993058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.994566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.995334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.996691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.998013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:01.999365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.001753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.003110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.004458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.005139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.006014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.007336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:02.008803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.010421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.012789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.014129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.015629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.015949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.017879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.019207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.020545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.021886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.024192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.025539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.026246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:02.026565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.028261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.029774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.031437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.033073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.035522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.037124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.037446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.037770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.039384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.040720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.042055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.043173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:02.045529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.046404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.046723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.047175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.049184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.050790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.052206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.053119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.055792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.056119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.056443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.057645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.059299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:02.060635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.061932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.063504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.065671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.065996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.066319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.067850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.069538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.070883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.071801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.073119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.074405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.074731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.760 [2024-06-10 13:56:02.075754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.077073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.078783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.080255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.081691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.082999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.084328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.084674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.086250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.087987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.089652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.760 [2024-06-10 13:56:02.090584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.091915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.093260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.094603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.095735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.097055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.098404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.100080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.101537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.102868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.104220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.106075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.107606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.109305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.110902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.112180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.113510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.114853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.116283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.118453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.119779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.121119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.122461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.124375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.126023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.127480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.128833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.131231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.132586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.134099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.135773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.137395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.138747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.140091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.141299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.143936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.145281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.146620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.147520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.149276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.150607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.151940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.152616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.155413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.157128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.158634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.159554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.161235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.162566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.164060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.164383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.166745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.168075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.169219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.170897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.172562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.173893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.174673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.174992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.177500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.178890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.179814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.181135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.182950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.184614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.184935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.185256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.187564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.188901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.190477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.191863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.193543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.194458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.194777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.195174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.197447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.198368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.199692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.201037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.203021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:47.761 [2024-06-10 13:56:02.203349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.203669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.761 [2024-06-10 13:56:02.204912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.207303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.208810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.210166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.211508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.212803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.213127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.213525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.214960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.216748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:47.762 [2024-06-10 13:56:02.218346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.027 [2024-06-10 13:56:02.293633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.293665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.293696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.294222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.294257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.294288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.294318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.295302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.295340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.295372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.296396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.296463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.296963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.027 [2024-06-10 13:56:02.297067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.297100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.297131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.297169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.297180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.297443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.298157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.299768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.301069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.301249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.301354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.301388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.301420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.027 [2024-06-10 13:56:02.301657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.302378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.303483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.303815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.304151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.304387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.304497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.306090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.307543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.309132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.309412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.311745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.027 [2024-06-10 13:56:02.313462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.027 [2024-06-10 13:56:02.313784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.314102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.314341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.315725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.317064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.318655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.319754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.319986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.322337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.323209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.323529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.323848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.324080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.325825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.327344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.328960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.329708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.329995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.332291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.332620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.332941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.334239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.334550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.335942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.337553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.338468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.340167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.340414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.341846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.342184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.342646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.344010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.344247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.345925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.347663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.348523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.349854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.350117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.351122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.351454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.352747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.354066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.354325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.355996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.356838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.358526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.360152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.360467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.361535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.362191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.363514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.364891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.365123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.366762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.367861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.369184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.370512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.370744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.371864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.373386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.374739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.376068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.376305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.377008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.378538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.380261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.381843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.382076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.383621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.384943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.386276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.387875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.388107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.389320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.390647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.391984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.393585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.393855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.396901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.398428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.399783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.401378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.401717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.403113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.404591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.406228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.407762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.408073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.410107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.411431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.413041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.028 [2024-06-10 13:56:02.414090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.414326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.415711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.417031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.418627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.419261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.028 [2024-06-10 13:56:02.419650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.422173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.423732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.425388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.426194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.426531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.427947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.029 [2024-06-10 13:56:02.429553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.430898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.431225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.431514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.433620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.435229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.436123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.437771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.438022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.439422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.441023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.441549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.441869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.029 [2024-06-10 13:56:02.442186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.444470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.446174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.447070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.448401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.448661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.450333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.451606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.451927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.452249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.452480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.454928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.029 [2024-06-10 13:56:02.455780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.029 [2024-06-10 13:56:02.457486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:48.295 (previous *ERROR* line repeated for each subsequent allocation attempt, 2024-06-10 13:56:02.457 through 13:56:02.627; identical repeats omitted)
00:27:48.295 [2024-06-10 13:56:02.627624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.627655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.627687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.628055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.628958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.628997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.629678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.295 [2024-06-10 13:56:02.629965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.631661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.632011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.632842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.632881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.295 [2024-06-10 13:56:02.632913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.632945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.633769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.295 [2024-06-10 13:56:02.635744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.635841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.295 [2024-06-10 13:56:02.636068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.636829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.636867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.636899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.636930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.637158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.637248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.637283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.637314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.637347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.637674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.638906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.639136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.640194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.640944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.642557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.642757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.643023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.643730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.643769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.643810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.643842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.644070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.644152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.644192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.644225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.644265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.644494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.645876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.646104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.646911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.646949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.646981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.647659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.648559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.648601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.648632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.648663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.648892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.648974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.649008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.649040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.649072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.649449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.296 [2024-06-10 13:56:02.650679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.650742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.296 [2024-06-10 13:56:02.651002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.651844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.651882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.651913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.651948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.652214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.652297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.652332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.652364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.652397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.297 [2024-06-10 13:56:02.652670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.653989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.654022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.654054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.654308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.297 [2024-06-10 13:56:02.655111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.655986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.656693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.656731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.656763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.656794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.297 [2024-06-10 13:56:02.657022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.297 [2024-06-10 13:56:02.657106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:48.297 [... previous message repeated continuously from 13:56:02.657140 through 13:56:02.901955 ...]
00:27:48.566 [2024-06-10 13:56:02.903555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:48.566 [2024-06-10 13:56:02.903895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.904221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.905799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.906032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.907500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.908065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.909408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.910927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.911160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.912418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.913937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.914262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.915406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.915826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.916237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.916588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.918183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.919374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.919679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.921223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.921557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.921879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.922205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.922552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.922941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.923269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.923591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.923913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.924237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.925769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.926097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.926440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.926770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.927097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.927491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.927816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.928137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.928462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.928760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.930154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.930492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.930814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.931153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.931591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.931978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.932306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.932627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.932947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.933252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.934742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.935072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.935397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.935719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.936220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.936645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.936971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.937296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.937622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.937963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.939182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.939515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.939837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.940181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.940503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.940895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.941224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.941546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.941867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.942225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.943412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.943745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.944068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.944394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.944738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.945128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.945458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.945780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.946110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.566 [2024-06-10 13:56:02.946482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.947728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.948059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.948386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.948709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.566 [2024-06-10 13:56:02.948994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.949408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.949734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.950054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.950379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.950798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.952070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.952404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.952729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.952763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.953123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.953233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.953555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.953877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.954201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.954529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.955654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.955985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.956770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.956870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.957211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.958457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.958501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.958533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.958565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.958880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.959018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.959052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.959085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.959118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.959447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.960405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.960444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.960476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.960507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.960967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.961052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.961093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.961126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.961157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.961500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.962689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.962727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.962760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.962803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.963541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.964424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.964462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.964493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.964525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.964885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.964971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.965006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.965038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.965070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.965542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.966388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.966426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.966457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.966507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.966935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.967037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.967073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.967105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.967137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.967482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.968543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.968582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.968614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.968648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.968936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.969024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.969058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.969090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.969123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.567 [2024-06-10 13:56:02.969390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.567 [2024-06-10 13:56:02.970145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:48.571 [2024-06-10 13:56:03.035840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:48.571 (previous message repeated continuously between 13:56:02.970145 and 13:56:03.035840; intermediate occurrences elided)
00:27:48.835 [2024-06-10 13:56:03.036440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.036762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.037217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.039504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.041188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.042032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.043338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.043580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.045242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.046616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.046937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.047261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.047496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.835 [2024-06-10 13:56:03.049870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.050922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.052520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.053940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.054206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.055863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.056443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.056765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.057086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.057322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.059646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.060420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.061747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.835 [2024-06-10 13:56:03.063087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.063325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.064806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.065132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.065456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.066639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.066900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.068722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.070291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.071703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.073011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.073247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.073967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.835 [2024-06-10 13:56:03.074296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.074620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.076144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.076381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.077790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.079137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.080470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.082063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.082298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.082685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.083009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.084059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.085405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.835 [2024-06-10 13:56:03.085662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.835 [2024-06-10 13:56:03.087919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.089352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.090675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.092277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.092664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.093052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.093407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.094894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.096533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.096766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.098885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.100218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.101813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.103024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.103383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.103770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.105008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.106332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.107666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.107899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.110239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.111643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.113245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.113566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.113848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.114381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.115725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.117254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.118940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.119177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.121244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.122829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.124278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.124609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.124895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.126030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.127291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.128627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.130229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.130476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.132675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.134284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.134796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.135117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.135475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.137062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.138762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.140375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.142100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.142435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.144757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.146168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.146490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.146811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.147047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.148459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.149785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.151388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.152326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.152559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.154940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.155434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.155755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.156167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.156404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.158184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.159854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.161587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.162507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.162770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.164696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.165037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.165364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.166810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.167063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.168479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.170071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.170715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.172238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.172472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.173534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.173866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.174571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.175875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.176109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.177759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.179170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.180254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.181550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.181801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.836 [2024-06-10 13:56:03.182891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.183227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.184694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.836 [2024-06-10 13:56:03.186007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.186293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.187950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.188630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.190170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.191896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.192132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.193274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.193963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.195283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.837 [2024-06-10 13:56:03.196609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.196843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.198418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.199575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.200895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.202217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.202450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.203925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.205495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.206898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.208229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.208461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:48.837 [2024-06-10 13:56:03.209188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:48.837 [2024-06-10 13:56:03.210812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.101 [... previous message repeated continuously from 13:56:03.210812 through 13:56:03.328179 (log timestamps 00:27:48.837 to 00:27:49.101); duplicate entries elided ...] 
00:27:49.101 [2024-06-10 13:56:03.328575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.329743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.329782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.329813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.329845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.330583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.333788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.333834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.101 [2024-06-10 13:56:03.518838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.101 [2024-06-10 13:56:03.520430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.520471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.521095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.523604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.523934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.524260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.525182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.526814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.528408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.529718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.531054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.533278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.102 [2024-06-10 13:56:03.533607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.533930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.535544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.537207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.538795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.539463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.540934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.542213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.542544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.543437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.544768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.546724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.547960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.102 [2024-06-10 13:56:03.549414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.550745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.550758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.551022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.552129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.552463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.554048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.555748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.557644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.558264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.559572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.561033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.561266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.102 [2024-06-10 13:56:03.562349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.563249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.564573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.565897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.567597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.568809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.570151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.571478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.571711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.572859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.102 [2024-06-10 13:56:03.574496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.575970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.576309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.366 [2024-06-10 13:56:03.577984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.578387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.578708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.579029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.579368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.581444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.581774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.582097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.583714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.585382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.586983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.587669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.589243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.366 [2024-06-10 13:56:03.589475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.590731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.591062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.591097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.591840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.592474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.594028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.594371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.594712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.595110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.596267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.596328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.596646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.366 [2024-06-10 13:56:03.596682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.597062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.597391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.597445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.597765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.598198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.599513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.599552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.599876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.599910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.600345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.600669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.600707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.366 [2024-06-10 13:56:03.601030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.601359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.366 [2024-06-10 13:56:03.602586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.602626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.602947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.602983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.603443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.603767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.603801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.604121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.604457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.605727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.605766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.367 [2024-06-10 13:56:03.606084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.606119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.606526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.606850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.606885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.607211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.607533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.608737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.608777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.609096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.609130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.609516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.609841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.367 [2024-06-10 13:56:03.609875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.610202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.610551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.611683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.611726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.612047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.612081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.612454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.612802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.612838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.613173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.613430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.614716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.367 [2024-06-10 13:56:03.614756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.615075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.615110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.615552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.615875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.615910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.616235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.616583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.617750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.617792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.618110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.618145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.618572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.367 [2024-06-10 13:56:03.618897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.618933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.619259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.619685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.621025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.621068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.621416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.621452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.622003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.622338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.622373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.622691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.623089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.367 [2024-06-10 13:56:03.624748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.624789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.625109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.625144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.625686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.626011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.626045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.626372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.626770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.627930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.627973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.628298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.367 [2024-06-10 13:56:03.628334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.370 [2024-06-10 13:56:03.766966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.767001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.767236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.768474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.768510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.768837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.768870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.770878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.770920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.772419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.772453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.772963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.774654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.370 [2024-06-10 13:56:03.774691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.776284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.776318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.777678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.777719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.778757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.778791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.779058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.780461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.780497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.782095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.782130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.784408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.370 [2024-06-10 13:56:03.784448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.786047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.370 [2024-06-10 13:56:03.786086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.786431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.786820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.786855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.788008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.788043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.790094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.790136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.791831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.791865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.792097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.793631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.793668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.795358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.795393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.797844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.797884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.799216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.799250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.799479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.800535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.800573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.802206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.802240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.803703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.803743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.804062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.804097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.804338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.805747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.805784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.807109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.807143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.809377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.809418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.810746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.810781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.811011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.811513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.811549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.811869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.811904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.814467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.814507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.815701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.815736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.815968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.817399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.817435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.818782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.818816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.820625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.820666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.821973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.822007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.822247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.823909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.823946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.825106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.825141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.827806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.827851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.828565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.828599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.829025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.829602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.829639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.830968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.831003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.832767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.832809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.834142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.834180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.834464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.836102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.836138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.371 [2024-06-10 13:56:03.837004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.371 [2024-06-10 13:56:03.837038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.840242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.840285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.841844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.841880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.842110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.842902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.842939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.844259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.844293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.845569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.633 [2024-06-10 13:56:03.845610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.634 [2024-06-10 13:56:03.845930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.847503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.847794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.849201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.849238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.850826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.850859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.853400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.853459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.855019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.855260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.855648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.855683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.634 [2024-06-10 13:56:03.856750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.856784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.857752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.858848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.860396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.861783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.862060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.863731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.865604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.866931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.868258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.869836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.870102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.634 [2024-06-10 13:56:03.870208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.871907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.873435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.874799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.876503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.877937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.879539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.881254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.881492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.882366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.883689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.885019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.886604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.634 [2024-06-10 13:56:03.888876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.890208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.891530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.892467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.892701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.894093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.895414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.896148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.896473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.898125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.898458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.899869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:49.634 [2024-06-10 13:56:03.901200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:49.634 [2024-06-10 13:56:03.901454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:49.634 (previous message repeated 129 more times, 13:56:03.903145 through 13:56:03.962211)
00:27:49.635 [2024-06-10 13:56:03.962253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:49.635 [2024-06-10 13:56:03.962573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:49.636 [2024-06-10 13:56:03.962607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:49.636 [2024-06-10 13:56:03.962842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:52.936
00:27:52.936 Latency(us)
00:27:52.936 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:27:52.936 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x0 length 0x100
00:27:52.936 crypto_ram : 5.73 44.70 2.79 0.00 0.00 2755022.51 92187.31 2292886.19
00:27:52.936 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x100 length 0x100
00:27:52.936 crypto_ram : 5.74 44.63 2.79 0.00 0.00 2770498.56 73400.32 2418715.31
00:27:52.936 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x0 length 0x100
00:27:52.936 crypto_ram2 : 5.74 47.20 2.95 0.00 0.00 2547721.37 7318.19 2278905.17
00:27:52.936 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x100 length 0x100
00:27:52.936 crypto_ram2 : 5.74 44.62 2.79 0.00 0.00 2670375.25 72963.41 2390753.28
00:27:52.936 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x0 length 0x100
00:27:52.936 crypto_ram3 : 5.56 315.08 19.69 0.00 0.00 363862.56 45001.39 548754.77
00:27:52.936 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x100 length 0x100
00:27:52.936 crypto_ram3 : 5.56 306.01 19.13 0.00 0.00 374092.92 52210.35 559240.53
00:27:52.936 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x0 length 0x100
00:27:52.936 crypto_ram4 : 5.67 331.32 20.71 0.00 0.00 336870.01 17039.36 450887.68
00:27:52.936 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:52.936 Verification LBA range: start 0x100 length 0x100
00:27:52.936 crypto_ram4 : 5.67 323.51 20.22 0.00 0.00 344221.36 12288.00 492830.72
00:27:52.936 ===================================================================================================================
00:27:52.936 Total : 1457.06 91.07 0.00 0.00 649164.73 7318.19 2418715.31
00:27:52.936
00:27:52.936 real 0m8.583s
00:27:52.936 user 0m16.548s
00:27:52.936 sys 0m0.269s
13:56:07 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
13:56:07 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:27:52.936 ************************************
00:27:52.936 END TEST bdev_verify_big_io
00:27:52.936 ************************************
13:56:07 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
13:56:07 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
13:56:07 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
13:56:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:27:52.936 ************************************
00:27:52.936 START TEST bdev_write_zeroes
00:27:52.936 ************************************
13:56:07 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:52.936 [2024-06-10 13:56:07.195988] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:27:52.936 [2024-06-10 13:56:07.196035] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1726571 ] 00:27:52.936 [2024-06-10 13:56:07.288641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:52.936 [2024-06-10 13:56:07.364211] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:52.936 [2024-06-10 13:56:07.385294] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:52.936 [2024-06-10 13:56:07.393323] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:52.936 [2024-06-10 13:56:07.401338] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:53.197 [2024-06-10 13:56:07.486973] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:55.739 [2024-06-10 13:56:09.632002] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:55.739 [2024-06-10 13:56:09.632058] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:55.739 [2024-06-10 13:56:09.632066] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.739 [2024-06-10 13:56:09.640021] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 
00:27:55.739 [2024-06-10 13:56:09.640034] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:55.739 [2024-06-10 13:56:09.640040] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.739 [2024-06-10 13:56:09.648039] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:55.739 [2024-06-10 13:56:09.648050] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:55.739 [2024-06-10 13:56:09.648056] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.739 [2024-06-10 13:56:09.656059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:55.739 [2024-06-10 13:56:09.656070] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:55.739 [2024-06-10 13:56:09.656076] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:55.739 Running I/O for 1 seconds... 
00:27:56.309
00:27:56.309 Latency(us)
00:27:56.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:27:56.309 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:27:56.309 crypto_ram : 1.02 2171.53 8.48 0.00 0.00 58535.48 5024.43 69468.16
00:27:56.309 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:27:56.309 crypto_ram2 : 1.02 2177.25 8.50 0.00 0.00 58093.72 5024.43 64662.19
00:27:56.309 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:27:56.309 crypto_ram3 : 1.02 16762.49 65.48 0.00 0.00 7529.71 2280.11 9557.33
00:27:56.309 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:27:56.309 crypto_ram4 : 1.02 16799.76 65.62 0.00 0.00 7492.45 2293.76 8738.13
00:27:56.309 ===================================================================================================================
00:27:56.309 Total : 37911.03 148.09 0.00 0.00 13361.12 2280.11 69468.16
00:27:56.570
00:27:56.570 real 0m3.840s
00:27:56.570 user 0m3.572s
00:27:56.570 sys 0m0.233s
13:56:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
13:56:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:27:56.570 ************************************
00:27:56.570 END TEST bdev_write_zeroes
00:27:56.570 ************************************
13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
00:27:56.570
13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:56.830 ************************************ 00:27:56.830 START TEST bdev_json_nonenclosed 00:27:56.830 ************************************ 00:27:56.830 13:56:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:56.830 [2024-06-10 13:56:11.109611] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:27:56.830 [2024-06-10 13:56:11.109658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727312 ] 00:27:56.830 [2024-06-10 13:56:11.197688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:56.830 [2024-06-10 13:56:11.266832] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:56.830 [2024-06-10 13:56:11.266885] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:27:56.830 [2024-06-10 13:56:11.266898] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:56.830 [2024-06-10 13:56:11.266905] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:57.091 00:27:57.091 real 0m0.270s 00:27:57.091 user 0m0.165s 00:27:57.091 sys 0m0.103s 00:27:57.091 13:56:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:57.091 13:56:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:27:57.091 ************************************ 00:27:57.091 END TEST bdev_json_nonenclosed 00:27:57.091 ************************************ 00:27:57.091 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:57.091 13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:27:57.091 13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:57.091 13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:57.091 ************************************ 00:27:57.091 START TEST bdev_json_nonarray 00:27:57.091 ************************************ 00:27:57.091 13:56:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:57.091 [2024-06-10 13:56:11.452617] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:27:57.091 [2024-06-10 13:56:11.452664] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727335 ] 00:27:57.091 [2024-06-10 13:56:11.542391] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:57.352 [2024-06-10 13:56:11.619245] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:57.352 [2024-06-10 13:56:11.619305] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:27:57.352 [2024-06-10 13:56:11.619317] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:57.352 [2024-06-10 13:56:11.619324] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:57.352 00:27:57.352 real 0m0.283s 00:27:57.352 user 0m0.170s 00:27:57.352 sys 0m0.111s 00:27:57.352 13:56:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:57.352 13:56:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:27:57.352 ************************************ 00:27:57.352 END TEST bdev_json_nonarray 00:27:57.352 ************************************ 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:57.352 13:56:11 
blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:27:57.352 13:56:11 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:27:57.352 00:27:57.352 real 1m7.524s 00:27:57.352 user 2m53.537s 00:27:57.352 sys 0m6.098s 00:27:57.352 13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:57.352 13:56:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:57.352 ************************************ 00:27:57.352 END TEST blockdev_crypto_aesni 00:27:57.352 ************************************ 00:27:57.352 13:56:11 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:27:57.352 13:56:11 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:57.352 13:56:11 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:57.352 13:56:11 -- common/autotest_common.sh@10 -- # set +x 00:27:57.352 ************************************ 00:27:57.352 START TEST blockdev_crypto_sw 00:27:57.352 ************************************ 00:27:57.352 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:27:57.614 * Looking for test storage... 
00:27:57.614 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:27:57.614 
13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1727471 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1727471 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@830 -- # '[' -z 1727471 ']' 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:57.614 13:56:11 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:57.614 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:57.614 13:56:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:57.614 [2024-06-10 13:56:11.989359] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:27:57.614 [2024-06-10 13:56:11.989427] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727471 ] 00:27:57.614 [2024-06-10 13:56:12.082630] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:57.873 [2024-06-10 13:56:12.153368] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:58.443 13:56:12 blockdev_crypto_sw -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:58.443 13:56:12 blockdev_crypto_sw -- common/autotest_common.sh@863 -- # return 0 00:27:58.443 13:56:12 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:58.443 13:56:12 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:27:58.443 13:56:12 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:27:58.443 13:56:12 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.443 13:56:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.702 Malloc0 00:27:58.702 Malloc1 00:27:58.702 true 00:27:58.702 true 00:27:58.702 true 00:27:58.702 [2024-06-10 13:56:13.037298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:58.702 crypto_ram 00:27:58.702 [2024-06-10 13:56:13.045324] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:58.702 crypto_ram2 00:27:58.702 [2024-06-10 13:56:13.053345] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:58.702 crypto_ram3 00:27:58.702 [ 00:27:58.702 { 00:27:58.702 "name": "Malloc1", 00:27:58.702 "aliases": [ 00:27:58.702 "011e85e5-3163-44f9-bbe9-9f518d45e236" 00:27:58.702 ], 00:27:58.702 "product_name": "Malloc disk", 00:27:58.702 "block_size": 4096, 00:27:58.702 "num_blocks": 4096, 00:27:58.702 "uuid": "011e85e5-3163-44f9-bbe9-9f518d45e236", 00:27:58.702 
"assigned_rate_limits": { 00:27:58.702 "rw_ios_per_sec": 0, 00:27:58.702 "rw_mbytes_per_sec": 0, 00:27:58.702 "r_mbytes_per_sec": 0, 00:27:58.702 "w_mbytes_per_sec": 0 00:27:58.702 }, 00:27:58.702 "claimed": true, 00:27:58.702 "claim_type": "exclusive_write", 00:27:58.702 "zoned": false, 00:27:58.702 "supported_io_types": { 00:27:58.702 "read": true, 00:27:58.702 "write": true, 00:27:58.702 "unmap": true, 00:27:58.702 "write_zeroes": true, 00:27:58.702 "flush": true, 00:27:58.702 "reset": true, 00:27:58.702 "compare": false, 00:27:58.702 "compare_and_write": false, 00:27:58.702 "abort": true, 00:27:58.702 "nvme_admin": false, 00:27:58.702 "nvme_io": false 00:27:58.702 }, 00:27:58.702 "memory_domains": [ 00:27:58.702 { 00:27:58.702 "dma_device_id": "system", 00:27:58.702 "dma_device_type": 1 00:27:58.702 }, 00:27:58.703 { 00:27:58.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.703 "dma_device_type": 2 00:27:58.703 } 00:27:58.703 ], 00:27:58.703 "driver_specific": {} 00:27:58.703 } 00:27:58.703 ] 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@740 
-- # rpc_cmd save_subsystem_config -n bdev 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:27:58.703 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:27:58.703 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3e1da837-a0ef-5681-be10-c2ae88646c91"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3e1da837-a0ef-5681-be10-c2ae88646c91",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ad405884-3d5e-512d-9cac-fbbefca04687"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ad405884-3d5e-512d-9cac-fbbefca04687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM 
EXIT 00:27:58.964 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1727471 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@949 -- # '[' -z 1727471 ']' 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # kill -0 1727471 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # uname 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1727471 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1727471' 00:27:58.964 killing process with pid 1727471 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # kill 1727471 00:27:58.964 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@973 -- # wait 1727471 00:27:59.226 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:59.226 13:56:13 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:59.226 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:27:59.226 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:59.226 13:56:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:59.226 ************************************ 00:27:59.226 START TEST bdev_hello_world 00:27:59.226 ************************************ 00:27:59.226 13:56:13 blockdev_crypto_sw.bdev_hello_world -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:59.226 [2024-06-10 13:56:13.602998] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:27:59.226 [2024-06-10 13:56:13.603046] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1727764 ] 00:27:59.226 [2024-06-10 13:56:13.691693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:59.487 [2024-06-10 13:56:13.761996] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:27:59.487 [2024-06-10 13:56:13.911260] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:59.487 [2024-06-10 13:56:13.911312] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:59.487 [2024-06-10 13:56:13.911321] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.487 [2024-06-10 13:56:13.919278] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:59.487 [2024-06-10 13:56:13.919290] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:59.487 [2024-06-10 13:56:13.919297] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:59.487 [2024-06-10 13:56:13.927298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:59.487 [2024-06-10 13:56:13.927309] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:59.487 [2024-06-10 13:56:13.927315] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:27:59.749 [2024-06-10 13:56:13.964395] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:59.749 [2024-06-10 13:56:13.964418] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:59.749 [2024-06-10 13:56:13.964429] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:59.749 [2024-06-10 13:56:13.965814] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:59.749 [2024-06-10 13:56:13.965872] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:59.749 [2024-06-10 13:56:13.965881] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:59.749 [2024-06-10 13:56:13.965910] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:27:59.749 00:27:59.749 [2024-06-10 13:56:13.965920] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:59.749 00:27:59.749 real 0m0.540s 00:27:59.749 user 0m0.375s 00:27:59.749 sys 0m0.144s 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:59.749 ************************************ 00:27:59.749 END TEST bdev_hello_world 00:27:59.749 ************************************ 00:27:59.749 13:56:14 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:59.749 13:56:14 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:27:59.749 13:56:14 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:59.749 13:56:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:59.749 ************************************ 00:27:59.749 START TEST bdev_bounds 00:27:59.749 ************************************ 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:27:59.749 13:56:14 
blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1728048 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1728048' 00:27:59.749 Process bdevio pid: 1728048 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1728048 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1728048 ']' 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:59.749 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:59.749 13:56:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:59.749 [2024-06-10 13:56:14.219016] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:27:59.749 [2024-06-10 13:56:14.219066] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1728048 ] 00:28:00.010 [2024-06-10 13:56:14.311306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:00.010 [2024-06-10 13:56:14.389098] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:28:00.010 [2024-06-10 13:56:14.389231] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2 00:28:00.010 [2024-06-10 13:56:14.389390] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:00.271 [2024-06-10 13:56:14.530399] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:00.271 [2024-06-10 13:56:14.530455] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:00.271 [2024-06-10 13:56:14.530464] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:00.271 [2024-06-10 13:56:14.538419] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:00.271 [2024-06-10 13:56:14.538431] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:00.271 [2024-06-10 13:56:14.538437] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:00.271 [2024-06-10 13:56:14.546440] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:00.271 [2024-06-10 13:56:14.546456] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:00.271 [2024-06-10 13:56:14.546462] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:00.840 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 
00:28:00.840 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:28:00.840 13:56:15 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:00.840 I/O targets: 00:28:00.840 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:28:00.840 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:28:00.840 00:28:00.840 00:28:00.840 CUnit - A unit testing framework for C - Version 2.1-3 00:28:00.840 http://cunit.sourceforge.net/ 00:28:00.840 00:28:00.840 00:28:00.840 Suite: bdevio tests on: crypto_ram3 00:28:00.840 Test: blockdev write read block ...passed 00:28:00.840 Test: blockdev write zeroes read block ...passed 00:28:00.840 Test: blockdev write zeroes read no split ...passed 00:28:00.841 Test: blockdev write zeroes read split ...passed 00:28:00.841 Test: blockdev write zeroes read split partial ...passed 00:28:00.841 Test: blockdev reset ...passed 00:28:00.841 Test: blockdev write read 8 blocks ...passed 00:28:00.841 Test: blockdev write read size > 128k ...passed 00:28:00.841 Test: blockdev write read invalid size ...passed 00:28:00.841 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:00.841 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:00.841 Test: blockdev write read max offset ...passed 00:28:00.841 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:00.841 Test: blockdev writev readv 8 blocks ...passed 00:28:00.841 Test: blockdev writev readv 30 x 1block ...passed 00:28:00.841 Test: blockdev writev readv block ...passed 00:28:00.841 Test: blockdev writev readv size > 128k ...passed 00:28:00.841 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:00.841 Test: blockdev comparev and writev ...passed 00:28:00.841 Test: blockdev nvme passthru rw ...passed 00:28:00.841 Test: blockdev nvme passthru vendor specific ...passed 00:28:00.841 
Test: blockdev nvme admin passthru ...passed 00:28:00.841 Test: blockdev copy ...passed 00:28:00.841 Suite: bdevio tests on: crypto_ram 00:28:00.841 Test: blockdev write read block ...passed 00:28:00.841 Test: blockdev write zeroes read block ...passed 00:28:00.841 Test: blockdev write zeroes read no split ...passed 00:28:00.841 Test: blockdev write zeroes read split ...passed 00:28:00.841 Test: blockdev write zeroes read split partial ...passed 00:28:00.841 Test: blockdev reset ...passed 00:28:00.841 Test: blockdev write read 8 blocks ...passed 00:28:00.841 Test: blockdev write read size > 128k ...passed 00:28:00.841 Test: blockdev write read invalid size ...passed 00:28:00.841 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:00.841 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:00.841 Test: blockdev write read max offset ...passed 00:28:00.841 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:00.841 Test: blockdev writev readv 8 blocks ...passed 00:28:00.841 Test: blockdev writev readv 30 x 1block ...passed 00:28:00.841 Test: blockdev writev readv block ...passed 00:28:00.841 Test: blockdev writev readv size > 128k ...passed 00:28:00.841 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:00.841 Test: blockdev comparev and writev ...passed 00:28:00.841 Test: blockdev nvme passthru rw ...passed 00:28:00.841 Test: blockdev nvme passthru vendor specific ...passed 00:28:00.841 Test: blockdev nvme admin passthru ...passed 00:28:00.841 Test: blockdev copy ...passed 00:28:00.841 00:28:00.841 Run Summary: Type Total Ran Passed Failed Inactive 00:28:00.841 suites 2 2 n/a 0 0 00:28:00.841 tests 46 46 46 0 0 00:28:00.841 asserts 260 260 260 0 n/a 00:28:00.841 00:28:00.841 Elapsed time = 0.060 seconds 00:28:00.841 0 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1728048 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@949 -- # '[' -z 1728048 ']' 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1728048 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1728048 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1728048' 00:28:00.841 killing process with pid 1728048 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1728048 00:28:00.841 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1728048 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:01.102 00:28:01.102 real 0m1.240s 00:28:01.102 user 0m3.402s 00:28:01.102 sys 0m0.270s 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:01.102 ************************************ 00:28:01.102 END TEST bdev_bounds 00:28:01.102 ************************************ 00:28:01.102 13:56:15 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:28:01.102 13:56:15 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:28:01.102 13:56:15 blockdev_crypto_sw -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:28:01.102 13:56:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:01.102 ************************************ 00:28:01.102 START TEST bdev_nbd 00:28:01.102 ************************************ 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1728217 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1728217 /var/tmp/spdk-nbd.sock 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1728217 ']' 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:01.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:01.102 13:56:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:01.102 [2024-06-10 13:56:15.541443] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:28:01.102 [2024-06-10 13:56:15.541493] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:01.362 [2024-06-10 13:56:15.634218] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:01.362 [2024-06-10 13:56:15.714974] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:01.622 [2024-06-10 13:56:15.859097] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:01.622 [2024-06-10 13:56:15.859144] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:01.622 [2024-06-10 13:56:15.859152] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.622 [2024-06-10 13:56:15.867113] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:01.622 [2024-06-10 13:56:15.867125] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:01.622 [2024-06-10 13:56:15.867131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:01.622 [2024-06-10 13:56:15.875134] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:01.622 [2024-06-10 13:56:15.875145] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:01.622 [2024-06-10 13:56:15.875151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram3' 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # 
(( i = 1 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.191 1+0 records in 00:28:02.191 1+0 records out 00:28:02.191 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277647 s, 14.8 MB/s 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:02.191 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:02.450 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:02.450 13:56:16 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.451 1+0 records in 00:28:02.451 1+0 records out 00:28:02.451 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254376 s, 16.1 MB/s 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:02.451 13:56:16 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:28:02.451 13:56:16 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:02.710 { 00:28:02.710 "nbd_device": "/dev/nbd0", 00:28:02.710 "bdev_name": "crypto_ram" 00:28:02.710 }, 00:28:02.710 { 00:28:02.710 "nbd_device": "/dev/nbd1", 00:28:02.710 "bdev_name": "crypto_ram3" 00:28:02.710 } 00:28:02.710 ]' 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:02.710 { 00:28:02.710 "nbd_device": "/dev/nbd0", 00:28:02.710 "bdev_name": "crypto_ram" 00:28:02.710 }, 00:28:02.710 { 00:28:02.710 "nbd_device": "/dev/nbd1", 00:28:02.710 "bdev_name": "crypto_ram3" 00:28:02.710 } 00:28:02.710 ]' 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.710 13:56:17 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:02.970 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:03.230 
13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:03.230 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@91 -- # local bdev_list 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:03.490 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:03.491 13:56:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:03.751 /dev/nbd0 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:03.751 13:56:18 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:03.751 1+0 records in 00:28:03.751 1+0 records out 00:28:03.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267244 s, 15.3 MB/s 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:03.751 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:28:04.011 /dev/nbd1 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:04.011 13:56:18 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:04.011 1+0 records in 00:28:04.011 1+0 records out 00:28:04.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303738 s, 13.5 MB/s 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:04.011 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:28:04.272 { 00:28:04.272 "nbd_device": "/dev/nbd0", 00:28:04.272 "bdev_name": "crypto_ram" 00:28:04.272 }, 00:28:04.272 { 00:28:04.272 "nbd_device": "/dev/nbd1", 00:28:04.272 "bdev_name": "crypto_ram3" 00:28:04.272 } 00:28:04.272 ]' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:28:04.272 { 00:28:04.272 "nbd_device": "/dev/nbd0", 00:28:04.272 "bdev_name": "crypto_ram" 00:28:04.272 }, 00:28:04.272 { 00:28:04.272 "nbd_device": "/dev/nbd1", 00:28:04.272 "bdev_name": "crypto_ram3" 00:28:04.272 } 00:28:04.272 ]' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:28:04.272 /dev/nbd1' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:28:04.272 /dev/nbd1' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:28:04.272 256+0 records in 00:28:04.272 256+0 records out 00:28:04.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115767 s, 90.6 MB/s 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:28:04.272 256+0 records in 00:28:04.272 256+0 records out 00:28:04.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0193199 s, 54.3 MB/s 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:28:04.272 256+0 records in 00:28:04.272 256+0 records out 00:28:04.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0320864 s, 32.7 MB/s 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 
00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:04.272 13:56:18 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:04.272 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:04.532 13:56:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@41 -- # break 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:04.792 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:28:05.053 malloc_lvol_verify 00:28:05.053 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:28:05.312 405b4912-40e2-4d07-a244-0a334f0de932 00:28:05.312 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:28:05.572 78a81082-48c7-4d54-845b-a242de6556aa 00:28:05.572 13:56:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:28:05.831 /dev/nbd0 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:28:05.831 mke2fs 1.46.5 (30-Dec-2021) 00:28:05.831 Discarding device blocks: 0/4096 done 00:28:05.831 Creating filesystem with 4096 1k blocks and 1024 inodes 00:28:05.831 00:28:05.831 Allocating group tables: 0/1 done 00:28:05.831 Writing inode tables: 0/1 done 00:28:05.831 Creating journal (1024 blocks): done 00:28:05.831 Writing superblocks and filesystem accounting information: 0/1 done 00:28:05.831 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:28:05.831 
13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:05.831 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1728217 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1728217 ']' 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1728217 00:28:06.091 13:56:20 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1728217 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1728217' 00:28:06.091 killing process with pid 1728217 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1728217 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1728217 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:28:06.091 00:28:06.091 real 0m5.049s 00:28:06.091 user 0m7.650s 00:28:06.091 sys 0m1.542s 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:06.091 13:56:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:06.091 ************************************ 00:28:06.091 END TEST bdev_nbd 00:28:06.091 ************************************ 00:28:06.091 13:56:20 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:28:06.091 13:56:20 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:28:06.092 13:56:20 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:28:06.092 13:56:20 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:28:06.092 13:56:20 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:06.092 13:56:20 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 
00:28:06.092 13:56:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:06.353 ************************************ 00:28:06.353 START TEST bdev_fio 00:28:06.353 ************************************ 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:06.353 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:06.353 ************************************ 00:28:06.353 START TEST bdev_fio_rw_verify 00:28:06.353 ************************************ 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.353 13:56:20 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:06.353 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:06.354 13:56:20 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:06.354 13:56:20 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:06.922 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.922 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:06.922 fio-3.35 00:28:06.922 Starting 2 threads 00:28:19.142 00:28:19.142 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1729747: Mon Jun 10 13:56:31 2024 00:28:19.142 read: IOPS=28.2k, BW=110MiB/s (115MB/s)(1101MiB/10000msec) 00:28:19.142 slat (usec): min=11, max=1435, avg=14.80, stdev= 4.33 00:28:19.142 clat (usec): min=6, max=1594, avg=111.41, stdev=44.12 00:28:19.142 lat (usec): min=18, max=1610, avg=126.20, stdev=45.16 00:28:19.142 clat percentiles (usec): 00:28:19.142 | 50.000th=[ 110], 99.000th=[ 210], 99.900th=[ 235], 99.990th=[ 269], 00:28:19.142 | 99.999th=[ 1565] 00:28:19.142 write: IOPS=34.0k, BW=133MiB/s 
(139MB/s)(1257MiB/9480msec); 0 zone resets 00:28:19.142 slat (usec): min=11, max=532, avg=26.17, stdev= 4.33 00:28:19.142 clat (usec): min=22, max=866, avg=151.91, stdev=69.58 00:28:19.142 lat (usec): min=44, max=993, avg=178.08, stdev=70.92 00:28:19.142 clat percentiles (usec): 00:28:19.142 | 50.000th=[ 149], 99.000th=[ 302], 99.900th=[ 334], 99.990th=[ 619], 00:28:19.142 | 99.999th=[ 791] 00:28:19.142 bw ( KiB/s): min=123080, max=133160, per=94.83%, avg=128795.37, stdev=1535.77, samples=38 00:28:19.142 iops : min=30770, max=33290, avg=32198.84, stdev=383.94, samples=38 00:28:19.142 lat (usec) : 10=0.01%, 20=0.01%, 50=7.53%, 100=26.16%, 250=60.79% 00:28:19.142 lat (usec) : 500=5.50%, 750=0.01%, 1000=0.01% 00:28:19.142 lat (msec) : 2=0.01% 00:28:19.142 cpu : usr=99.70%, sys=0.01%, ctx=29, majf=0, minf=535 00:28:19.142 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:19.142 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:19.142 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:19.142 issued rwts: total=281810,321899,0,0 short=0,0,0,0 dropped=0,0,0,0 00:28:19.143 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:19.143 00:28:19.143 Run status group 0 (all jobs): 00:28:19.143 READ: bw=110MiB/s (115MB/s), 110MiB/s-110MiB/s (115MB/s-115MB/s), io=1101MiB (1154MB), run=10000-10000msec 00:28:19.143 WRITE: bw=133MiB/s (139MB/s), 133MiB/s-133MiB/s (139MB/s-139MB/s), io=1257MiB (1318MB), run=9480-9480msec 00:28:19.143 00:28:19.143 real 0m11.025s 00:28:19.143 user 0m30.419s 00:28:19.143 sys 0m0.357s 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:28:19.143 ************************************ 00:28:19.143 END TEST bdev_fio_rw_verify 00:28:19.143 ************************************ 
00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:28:19.143 
13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3e1da837-a0ef-5681-be10-c2ae88646c91"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3e1da837-a0ef-5681-be10-c2ae88646c91",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ad405884-3d5e-512d-9cac-fbbefca04687"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ad405884-3d5e-512d-9cac-fbbefca04687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:28:19.143 crypto_ram3 ]] 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "3e1da837-a0ef-5681-be10-c2ae88646c91"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "3e1da837-a0ef-5681-be10-c2ae88646c91",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ad405884-3d5e-512d-9cac-fbbefca04687"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' 
"uuid": "ad405884-3d5e-512d-9cac-fbbefca04687",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:19.143 ************************************ 00:28:19.143 START TEST bdev_fio_trim 00:28:19.143 ************************************ 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:28:19.143 13:56:31 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:28:19.143 13:56:31 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:19.143 13:56:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:19.143 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:19.143 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:19.143 fio-3.35 00:28:19.143 Starting 2 threads 00:28:29.134 00:28:29.134 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1731970: Mon Jun 10 13:56:42 2024 00:28:29.134 write: IOPS=53.5k, BW=209MiB/s (219MB/s)(2091MiB/10001msec); 0 zone resets 00:28:29.134 slat (usec): min=11, max=516, avg=15.88, stdev= 4.90 00:28:29.134 clat (usec): min=34, max=1312, avg=124.58, stdev=68.57 00:28:29.134 lat (usec): min=45, max=1327, avg=140.46, stdev=71.12 00:28:29.134 clat percentiles (usec): 00:28:29.134 | 50.000th=[ 99], 99.000th=[ 258], 99.900th=[ 285], 99.990th=[ 570], 00:28:29.134 | 99.999th=[ 693] 00:28:29.134 bw ( KiB/s): min=210680, max=216784, per=100.00%, avg=214137.68, stdev=662.51, samples=38 00:28:29.134 iops : min=52670, max=54196, avg=53534.42, stdev=165.63, samples=38 00:28:29.134 trim: IOPS=53.5k, BW=209MiB/s (219MB/s)(2091MiB/10001msec); 0 zone resets 00:28:29.134 slat (usec): min=5, max=1195, avg= 7.22, stdev= 2.78 00:28:29.134 clat (usec): min=38, max=1327, avg=83.42, stdev=24.70 00:28:29.134 lat (usec): min=45, max=1333, avg=90.65, stdev=24.77 00:28:29.134 clat percentiles (usec): 00:28:29.134 | 
50.000th=[ 84], 99.000th=[ 139], 99.900th=[ 151], 99.990th=[ 245], 00:28:29.134 | 99.999th=[ 457] 00:28:29.134 bw ( KiB/s): min=210712, max=216776, per=100.00%, avg=214138.95, stdev=659.12, samples=38 00:28:29.134 iops : min=52678, max=54194, avg=53534.74, stdev=164.77, samples=38 00:28:29.134 lat (usec) : 50=11.44%, 100=49.60%, 250=38.01%, 500=0.95%, 750=0.01% 00:28:29.134 lat (msec) : 2=0.01% 00:28:29.134 cpu : usr=99.75%, sys=0.00%, ctx=84, majf=0, minf=274 00:28:29.134 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:29.134 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:29.134 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:29.134 issued rwts: total=0,535379,535380,0 short=0,0,0,0 dropped=0,0,0,0 00:28:29.134 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:29.134 00:28:29.134 Run status group 0 (all jobs): 00:28:29.134 WRITE: bw=209MiB/s (219MB/s), 209MiB/s-209MiB/s (219MB/s-219MB/s), io=2091MiB (2193MB), run=10001-10001msec 00:28:29.134 TRIM: bw=209MiB/s (219MB/s), 209MiB/s-209MiB/s (219MB/s-219MB/s), io=2091MiB (2193MB), run=10001-10001msec 00:28:29.134 00:28:29.134 real 0m11.092s 00:28:29.134 user 0m32.531s 00:28:29.134 sys 0m0.315s 00:28:29.134 13:56:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:29.134 13:56:42 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:29.134 ************************************ 00:28:29.134 END TEST bdev_fio_trim 00:28:29.134 ************************************ 00:28:29.134 13:56:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:29.134 13:56:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:29.134 13:56:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:29.135 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:29.135 13:56:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:29.135 00:28:29.135 real 0m22.445s 00:28:29.135 user 1m3.127s 00:28:29.135 sys 0m0.838s 00:28:29.135 13:56:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:29.135 13:56:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:29.135 ************************************ 00:28:29.135 END TEST bdev_fio 00:28:29.135 ************************************ 00:28:29.135 13:56:43 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:29.135 13:56:43 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:29.135 13:56:43 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:28:29.135 13:56:43 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:29.135 13:56:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:29.135 ************************************ 00:28:29.135 START TEST bdev_verify 00:28:29.135 ************************************ 00:28:29.135 13:56:43 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:29.135 [2024-06-10 13:56:43.175706] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:28:29.135 [2024-06-10 13:56:43.175765] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1733956 ] 00:28:29.135 [2024-06-10 13:56:43.267117] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:29.135 [2024-06-10 13:56:43.364536] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:28:29.135 [2024-06-10 13:56:43.364544] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:29.135 [2024-06-10 13:56:43.519014] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:29.135 [2024-06-10 13:56:43.519069] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:29.135 [2024-06-10 13:56:43.519078] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.135 [2024-06-10 13:56:43.527032] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:29.135 [2024-06-10 13:56:43.527045] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:29.135 [2024-06-10 13:56:43.527051] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.135 [2024-06-10 13:56:43.535053] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:29.135 [2024-06-10 13:56:43.535065] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:29.135 [2024-06-10 13:56:43.535071] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:29.135 Running I/O for 5 seconds... 
00:28:34.512 00:28:34.512 Latency(us) 00:28:34.512 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:34.513 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:34.513 Verification LBA range: start 0x0 length 0x800 00:28:34.513 crypto_ram : 5.02 7962.74 31.10 0.00 0.00 16020.07 1297.07 17476.27 00:28:34.513 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:34.513 Verification LBA range: start 0x800 length 0x800 00:28:34.513 crypto_ram : 5.01 7926.11 30.96 0.00 0.00 16090.45 1283.41 18677.76 00:28:34.513 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:34.513 Verification LBA range: start 0x0 length 0x800 00:28:34.513 crypto_ram3 : 5.02 3979.77 15.55 0.00 0.00 32002.52 5789.01 24685.23 00:28:34.513 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:34.513 Verification LBA range: start 0x800 length 0x800 00:28:34.513 crypto_ram3 : 5.02 3978.61 15.54 0.00 0.00 32018.11 1583.79 24685.23 00:28:34.513 =================================================================================================================== 00:28:34.513 Total : 23847.24 93.15 0.00 0.00 21384.93 1283.41 24685.23 00:28:34.513 00:28:34.513 real 0m5.623s 00:28:34.513 user 0m10.708s 00:28:34.513 sys 0m0.178s 00:28:34.513 13:56:48 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:34.513 13:56:48 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:28:34.513 ************************************ 00:28:34.513 END TEST bdev_verify 00:28:34.513 ************************************ 00:28:34.513 13:56:48 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:34.513 13:56:48 
blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:28:34.513 13:56:48 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:34.513 13:56:48 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:34.513 ************************************ 00:28:34.513 START TEST bdev_verify_big_io 00:28:34.513 ************************************ 00:28:34.513 13:56:48 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:34.513 [2024-06-10 13:56:48.866029] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:28:34.513 [2024-06-10 13:56:48.866079] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1735153 ] 00:28:34.513 [2024-06-10 13:56:48.953593] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:34.774 [2024-06-10 13:56:49.028860] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:28:34.774 [2024-06-10 13:56:49.028865] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:34.774 [2024-06-10 13:56:49.168549] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:34.774 [2024-06-10 13:56:49.168596] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:34.774 [2024-06-10 13:56:49.168605] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.774 [2024-06-10 13:56:49.176570] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:34.774 [2024-06-10 13:56:49.176583] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:28:34.774 [2024-06-10 13:56:49.176589] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.774 [2024-06-10 13:56:49.184593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:34.774 [2024-06-10 13:56:49.184604] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:34.774 [2024-06-10 13:56:49.184614] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:34.774 Running I/O for 5 seconds... 00:28:40.057 00:28:40.057 Latency(us) 00:28:40.057 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:40.057 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:40.057 Verification LBA range: start 0x0 length 0x80 00:28:40.057 crypto_ram : 5.05 456.54 28.53 0.00 0.00 273570.17 3604.48 387973.12 00:28:40.057 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:40.057 Verification LBA range: start 0x80 length 0x80 00:28:40.057 crypto_ram : 5.06 455.41 28.46 0.00 0.00 274359.74 3386.03 384477.87 00:28:40.057 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:40.057 Verification LBA range: start 0x0 length 0x80 00:28:40.057 crypto_ram3 : 5.27 267.08 16.69 0.00 0.00 450325.45 3345.07 391468.37 00:28:40.057 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:40.057 Verification LBA range: start 0x80 length 0x80 00:28:40.057 crypto_ram3 : 5.24 244.41 15.28 0.00 0.00 492236.33 3481.60 398458.88 00:28:40.057 =================================================================================================================== 00:28:40.057 Total : 1423.44 88.96 0.00 0.00 346292.66 3345.07 398458.88 00:28:40.317 00:28:40.317 real 0m5.840s 00:28:40.317 user 0m11.202s 00:28:40.317 sys 0m0.157s 00:28:40.317 
13:56:54 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:40.317 13:56:54 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:28:40.317 ************************************ 00:28:40.317 END TEST bdev_verify_big_io 00:28:40.317 ************************************ 00:28:40.317 13:56:54 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:40.317 13:56:54 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:28:40.317 13:56:54 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:40.317 13:56:54 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:40.317 ************************************ 00:28:40.317 START TEST bdev_write_zeroes 00:28:40.317 ************************************ 00:28:40.317 13:56:54 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:40.317 [2024-06-10 13:56:54.779630] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:28:40.317 [2024-06-10 13:56:54.779678] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736234 ] 00:28:40.577 [2024-06-10 13:56:54.867721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:40.577 [2024-06-10 13:56:54.937834] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:40.837 [2024-06-10 13:56:55.076122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:40.837 [2024-06-10 13:56:55.076171] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:40.837 [2024-06-10 13:56:55.076179] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:40.837 [2024-06-10 13:56:55.084140] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:40.837 [2024-06-10 13:56:55.084153] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:40.837 [2024-06-10 13:56:55.084159] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:40.837 [2024-06-10 13:56:55.092168] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:40.837 [2024-06-10 13:56:55.092179] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:40.837 [2024-06-10 13:56:55.092185] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:40.837 Running I/O for 1 seconds... 
00:28:41.778 00:28:41.778 Latency(us) 00:28:41.778 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:41.778 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:41.778 crypto_ram : 1.01 30746.78 120.10 0.00 0.00 4153.40 1085.44 5816.32 00:28:41.778 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:41.778 crypto_ram3 : 1.01 15344.79 59.94 0.00 0.00 8289.48 5215.57 8519.68 00:28:41.778 =================================================================================================================== 00:28:41.778 Total : 46091.57 180.05 0.00 0.00 5532.09 1085.44 8519.68 00:28:42.038 00:28:42.038 real 0m1.547s 00:28:42.038 user 0m1.374s 00:28:42.038 sys 0m0.159s 00:28:42.038 13:56:56 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:42.038 13:56:56 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:28:42.038 ************************************ 00:28:42.038 END TEST bdev_write_zeroes 00:28:42.038 ************************************ 00:28:42.039 13:56:56 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:42.039 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:28:42.039 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:42.039 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:42.039 ************************************ 00:28:42.039 START TEST bdev_json_nonenclosed 00:28:42.039 ************************************ 00:28:42.039 13:56:56 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:42.039 [2024-06-10 13:56:56.402759] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:28:42.039 [2024-06-10 13:56:56.402807] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736577 ] 00:28:42.039 [2024-06-10 13:56:56.492762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:42.298 [2024-06-10 13:56:56.566678] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:42.298 [2024-06-10 13:56:56.566733] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:28:42.298 [2024-06-10 13:56:56.566745] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:42.298 [2024-06-10 13:56:56.566752] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:42.298 00:28:42.298 real 0m0.279s 00:28:42.298 user 0m0.171s 00:28:42.298 sys 0m0.107s 00:28:42.298 13:56:56 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:42.298 13:56:56 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:42.298 ************************************ 00:28:42.298 END TEST bdev_json_nonenclosed 00:28:42.298 ************************************ 00:28:42.298 13:56:56 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:42.298 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@1100 
-- # '[' 13 -le 1 ']' 00:28:42.298 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:42.298 13:56:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:42.298 ************************************ 00:28:42.298 START TEST bdev_json_nonarray 00:28:42.298 ************************************ 00:28:42.298 13:56:56 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:42.298 [2024-06-10 13:56:56.762833] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:28:42.298 [2024-06-10 13:56:56.762895] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736606 ] 00:28:42.557 [2024-06-10 13:56:56.852843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:42.557 [2024-06-10 13:56:56.926483] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:42.558 [2024-06-10 13:56:56.926545] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:28:42.558 [2024-06-10 13:56:56.926558] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:42.558 [2024-06-10 13:56:56.926565] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:42.558 00:28:42.558 real 0m0.287s 00:28:42.558 user 0m0.181s 00:28:42.558 sys 0m0.104s 00:28:42.558 13:56:56 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:42.558 13:56:56 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:42.558 ************************************ 00:28:42.558 END TEST bdev_json_nonarray 00:28:42.558 ************************************ 00:28:42.558 13:56:57 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:28:42.558 13:56:57 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:28:42.558 13:56:57 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:28:42.558 13:56:57 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:28:42.558 13:56:57 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:28:42.558 13:56:57 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:42.558 13:56:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:42.817 ************************************ 00:28:42.817 START TEST bdev_crypto_enomem 00:28:42.817 ************************************ 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # bdev_crypto_enomem 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:28:42.817 13:56:57 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1736640 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1736640 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@830 -- # '[' -z 1736640 ']' 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:42.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:42.817 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:28:42.817 [2024-06-10 13:56:57.112600] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:28:42.817 [2024-06-10 13:56:57.112647] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1736640 ] 00:28:42.817 [2024-06-10 13:56:57.182730] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:42.817 [2024-06-10 13:56:57.247698] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@863 -- # return 0 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:43.757 true 00:28:43.757 base0 00:28:43.757 true 00:28:43.757 [2024-06-10 13:56:57.988448] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:43.757 crypt0 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_name=crypt0 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local i 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:43.757 13:56:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:43.757 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:43.757 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:43.758 [ 00:28:43.758 { 00:28:43.758 "name": "crypt0", 00:28:43.758 "aliases": [ 00:28:43.758 "17fe78cc-d2b3-5766-bdef-3f0c07ac532c" 00:28:43.758 ], 00:28:43.758 "product_name": "crypto", 00:28:43.758 "block_size": 512, 00:28:43.758 "num_blocks": 2097152, 00:28:43.758 "uuid": "17fe78cc-d2b3-5766-bdef-3f0c07ac532c", 00:28:43.758 "assigned_rate_limits": { 00:28:43.758 "rw_ios_per_sec": 0, 00:28:43.758 "rw_mbytes_per_sec": 0, 00:28:43.758 "r_mbytes_per_sec": 0, 00:28:43.758 "w_mbytes_per_sec": 0 00:28:43.758 }, 00:28:43.758 "claimed": false, 00:28:43.758 "zoned": false, 00:28:43.758 "supported_io_types": { 00:28:43.758 "read": true, 00:28:43.758 "write": true, 00:28:43.758 "unmap": false, 00:28:43.758 "write_zeroes": true, 00:28:43.758 "flush": false, 00:28:43.758 "reset": true, 00:28:43.758 "compare": false, 00:28:43.758 "compare_and_write": false, 00:28:43.758 "abort": false, 00:28:43.758 "nvme_admin": false, 00:28:43.758 "nvme_io": false 00:28:43.758 }, 00:28:43.758 "memory_domains": [ 00:28:43.758 { 00:28:43.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.758 "dma_device_type": 2 00:28:43.758 } 00:28:43.758 ], 00:28:43.758 
"driver_specific": { 00:28:43.758 "crypto": { 00:28:43.758 "base_bdev_name": "EE_base0", 00:28:43.758 "name": "crypt0", 00:28:43.758 "key_name": "test_dek_sw" 00:28:43.758 } 00:28:43.758 } 00:28:43.758 } 00:28:43.758 ] 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # return 0 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1736951 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:28:43.758 13:56:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:43.758 Running I/O for 5 seconds... 00:28:44.697 13:56:59 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:28:44.697 13:56:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:44.697 13:56:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:44.697 13:56:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:44.697 13:56:59 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1736951 00:28:48.894 00:28:48.894 Latency(us) 00:28:48.894 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.894 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:28:48.894 crypt0 : 5.00 41526.19 162.21 0.00 0.00 766.96 375.47 1297.07 00:28:48.894 =================================================================================================================== 00:28:48.894 Total : 41526.19 162.21 0.00 0.00 766.96 375.47 1297.07 00:28:48.894 0 00:28:48.894 13:57:03 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1736640 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@949 -- # '[' -z 1736640 ']' 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # kill -0 1736640 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # uname 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1736640 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1736640' 00:28:48.894 killing process with pid 1736640 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # kill 1736640 00:28:48.894 Received shutdown signal, test time was about 5.000000 seconds 00:28:48.894 00:28:48.894 Latency(us) 00:28:48.894 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.894 =================================================================================================================== 
00:28:48.894 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@973 -- # wait 1736640 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:28:48.894 00:28:48.894 real 0m6.254s 00:28:48.894 user 0m6.520s 00:28:48.894 sys 0m0.254s 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:48.894 13:57:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:48.894 ************************************ 00:28:48.894 END TEST bdev_crypto_enomem 00:28:48.894 ************************************ 00:28:48.894 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:28:48.895 13:57:03 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:28:48.895 00:28:48.895 real 0m51.552s 00:28:48.895 user 1m46.841s 00:28:48.895 sys 0m4.716s 00:28:48.895 13:57:03 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:48.895 13:57:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:48.895 ************************************ 00:28:48.895 END TEST blockdev_crypto_sw 00:28:48.895 ************************************ 00:28:49.155 
13:57:03 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:28:49.155 13:57:03 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:49.155 13:57:03 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:49.155 13:57:03 -- common/autotest_common.sh@10 -- # set +x 00:28:49.155 ************************************ 00:28:49.155 START TEST blockdev_crypto_qat 00:28:49.155 ************************************ 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:28:49.155 * Looking for test storage... 00:28:49.155 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:49.155 13:57:03 blockdev_crypto_qat -- 
bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1738116 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1738116 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@830 -- # '[' -z 1738116 ']' 00:28:49.155 13:57:03 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:49.155 13:57:03 
blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:49.155 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:49.155 13:57:03 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:49.155 [2024-06-10 13:57:03.621874] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:28:49.155 [2024-06-10 13:57:03.621940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1738116 ] 00:28:49.415 [2024-06-10 13:57:03.713225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:49.415 [2024-06-10 13:57:03.808678] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:50.355 13:57:04 blockdev_crypto_qat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:50.355 13:57:04 blockdev_crypto_qat -- common/autotest_common.sh@863 -- # return 0 00:28:50.355 13:57:04 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:50.355 13:57:04 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:28:50.355 13:57:04 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:28:50.355 13:57:04 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:50.355 13:57:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:50.355 [2024-06-10 13:57:04.490736] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:50.355 [2024-06-10 13:57:04.498771] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation encrypt will be assigned to module dpdk_cryptodev
00:28:50.355 [2024-06-10 13:57:04.506782] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:28:50.355 [2024-06-10 13:57:04.580305] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:28:52.896 true
00:28:52.896 true
00:28:52.896 true
00:28:52.896 true
00:28:52.896 Malloc0
00:28:52.896 Malloc1
00:28:52.896 Malloc2
00:28:52.896 Malloc3
00:28:52.896 [2024-06-10 13:57:06.958692] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:28:52.896 crypto_ram
00:28:52.896 [2024-06-10 13:57:06.966710] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:28:52.896 crypto_ram1
00:28:52.897 [2024-06-10 13:57:06.974734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:28:52.897 crypto_ram2
00:28:52.897 [2024-06-10 13:57:06.982759] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:28:52.897 crypto_ram3
00:28:52.897 [
00:28:52.897 {
00:28:52.897 "name": "Malloc1",
00:28:52.897 "aliases": [
00:28:52.897 "74be650e-7f8c-44b2-b5f4-f0f09847c352"
00:28:52.897 ],
00:28:52.897 "product_name": "Malloc disk",
00:28:52.897 "block_size": 512,
00:28:52.897 "num_blocks": 65536,
00:28:52.897 "uuid": "74be650e-7f8c-44b2-b5f4-f0f09847c352",
00:28:52.897 "assigned_rate_limits": {
00:28:52.897 "rw_ios_per_sec": 0,
00:28:52.897 "rw_mbytes_per_sec": 0,
00:28:52.897 "r_mbytes_per_sec": 0,
00:28:52.897 "w_mbytes_per_sec": 0
00:28:52.897 },
00:28:52.897 "claimed": true,
00:28:52.897 "claim_type": "exclusive_write",
00:28:52.897 "zoned": false,
00:28:52.897 "supported_io_types": {
00:28:52.897 "read": true,
00:28:52.897 "write": true,
00:28:52.897 "unmap": true,
00:28:52.897 "write_zeroes": true,
00:28:52.897 "flush": true,
00:28:52.897 "reset": true,
00:28:52.897 "compare": false,
00:28:52.897 "compare_and_write": false,
00:28:52.897 "abort": true,
00:28:52.897 "nvme_admin": false,
00:28:52.897 "nvme_io": false
00:28:52.897 },
00:28:52.897 "memory_domains": [
00:28:52.897 {
00:28:52.897 "dma_device_id": "system",
00:28:52.897 "dma_device_type": 1
00:28:52.897 },
00:28:52.897 {
00:28:52.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:28:52.897 "dma_device_type": 2
00:28:52.897 }
00:28:52.897 ],
00:28:52.897 "driver_specific": {}
00:28:52.897 }
00:28:52.897 ]
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)'
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]]
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9e21ed51-3386-51b8-be0d-2d5dfa69b307"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e21ed51-3386-51b8-be0d-2d5dfa69b307",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c803179c-c93c-587a-9852-1efcdadfe26b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c803179c-c93c-587a-9852-1efcdadfe26b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}'
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}")
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT
00:28:52.897 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1738116
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@949 -- # '[' -z 1738116 ']'
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # kill -0 1738116
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # uname
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1738116
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1738116'
killing process with pid 1738116
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # kill 1738116
00:28:52.897 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@973 -- # wait 1738116
00:28:53.158 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT
00:28:53.158 13:57:07 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram ''
00:28:53.158 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']'
00:28:53.158 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:28:53.158 13:57:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:53.158 ************************************
00:28:53.158 START TEST bdev_hello_world
00:28:53.158 ************************************
00:28:53.158 13:57:07 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram ''
00:28:53.417 [2024-06-10 13:57:07.667185] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:28:53.417 [2024-06-10 13:57:07.667243] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1738791 ]
00:28:53.417 [2024-06-10 13:57:07.758861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:53.417 [2024-06-10 13:57:07.854541] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:28:53.417 [2024-06-10 13:57:07.875802] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:28:53.417 [2024-06-10 13:57:07.883834] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:28:53.418 [2024-06-10 13:57:07.891845] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:28:53.676 [2024-06-10 13:57:07.998463] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:28:56.216 [2024-06-10 13:57:10.199095] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:28:56.216 [2024-06-10 13:57:10.199147] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:28:56.216 [2024-06-10 13:57:10.199156] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:56.216 [2024-06-10 13:57:10.207114] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:28:56.216 [2024-06-10 13:57:10.207126] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:28:56.216 [2024-06-10 13:57:10.207132] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:56.216 [2024-06-10 13:57:10.215134] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:28:56.216 [2024-06-10 13:57:10.215145] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:28:56.216 [2024-06-10 13:57:10.215151] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:56.216 [2024-06-10 13:57:10.223154] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:28:56.217 [2024-06-10 13:57:10.223168] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:28:56.217 [2024-06-10 13:57:10.223174] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:56.217 [2024-06-10 13:57:10.285622] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application
00:28:56.217 [2024-06-10 13:57:10.285648] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram
00:28:56.217 [2024-06-10 13:57:10.285659] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel
00:28:56.217 [2024-06-10 13:57:10.286759] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev
00:28:56.217 [2024-06-10 13:57:10.286814] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully
00:28:56.217 [2024-06-10 13:57:10.286824] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io
00:28:56.217 [2024-06-10 13:57:10.286859] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World!
00:28:56.217
00:28:56.217 [2024-06-10 13:57:10.286871] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app
00:28:56.217
00:28:56.217 real 0m2.891s
00:28:56.217 user 0m2.565s
00:28:56.217 sys 0m0.282s
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x
00:28:56.217 ************************************
00:28:56.217 END TEST bdev_hello_world
00:28:56.217 ************************************
00:28:56.217 13:57:10 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds ''
00:28:56.217 13:57:10 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']'
00:28:56.217 13:57:10 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:28:56.217 13:57:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:56.217 ************************************
00:28:56.217 START TEST bdev_bounds
00:28:56.217 ************************************
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds ''
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1739779
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1739779'
Process bdevio pid: 1739779
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1739779
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 1739779 ']'
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable
00:28:56.217 13:57:10 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:28:56.217 [2024-06-10 13:57:10.635530] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:28:56.217 [2024-06-10 13:57:10.635581] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1739779 ]
00:28:56.477 [2024-06-10 13:57:10.726103] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:28:56.477 [2024-06-10 13:57:10.794904] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1
00:28:56.477 [2024-06-10 13:57:10.795039] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 2
00:28:56.477 [2024-06-10 13:57:10.795042] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:28:56.477 [2024-06-10 13:57:10.816275] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:28:56.477 [2024-06-10 13:57:10.824304] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:28:56.477 [2024-06-10 13:57:10.832322] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:28:56.477 [2024-06-10 13:57:10.928634] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:28:59.016 [2024-06-10 13:57:13.070718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:28:59.016 [2024-06-10 13:57:13.070774] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:28:59.016 [2024-06-10 13:57:13.070783] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:59.016 [2024-06-10 13:57:13.078734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:28:59.016 [2024-06-10 13:57:13.078746] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:28:59.016 [2024-06-10 13:57:13.078752] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:59.016 [2024-06-10 13:57:13.086755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:28:59.016 [2024-06-10 13:57:13.086770] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:28:59.016 [2024-06-10 13:57:13.086776] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:59.016 [2024-06-10 13:57:13.094776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:28:59.016 [2024-06-10 13:57:13.094786] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:28:59.016 [2024-06-10 13:57:13.094792] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:28:59.016 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:28:59.016 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@863 -- # return 0
00:28:59.016 13:57:13 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests
00:28:59.016 I/O targets:
00:28:59.016 crypto_ram: 65536 blocks of 512 bytes (32 MiB)
00:28:59.016 crypto_ram1: 65536 blocks of 512 bytes (32 MiB)
00:28:59.016 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB)
00:28:59.016 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB)
00:28:59.016
00:28:59.016
00:28:59.016 CUnit - A unit testing framework for C - Version 2.1-3
00:28:59.016 http://cunit.sourceforge.net/
00:28:59.016
00:28:59.016
00:28:59.016 Suite: bdevio tests on: crypto_ram3
00:28:59.016 Test: blockdev write read block ...passed
00:28:59.016 Test: blockdev write zeroes read block ...passed
00:28:59.016 Test: blockdev write zeroes read no split ...passed
00:28:59.016 Test: blockdev write zeroes read split ...passed
00:28:59.016 Test: blockdev write zeroes read split partial ...passed
00:28:59.016 Test: blockdev reset ...passed
00:28:59.016 Test: blockdev write read 8 blocks ...passed
00:28:59.016 Test: blockdev write read size > 128k ...passed
00:28:59.016 Test: blockdev write read invalid size ...passed
00:28:59.016 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:28:59.016 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:28:59.016 Test: blockdev write read max offset ...passed
00:28:59.016 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:28:59.016 Test: blockdev writev readv 8 blocks ...passed
00:28:59.016 Test: blockdev writev readv 30 x 1block ...passed
00:28:59.016 Test: blockdev writev readv block ...passed
00:28:59.016 Test: blockdev writev readv size > 128k ...passed
00:28:59.016 Test: blockdev writev readv size > 128k in two iovs ...passed
00:28:59.016 Test: blockdev comparev and writev ...passed
00:28:59.016 Test: blockdev nvme passthru rw ...passed
00:28:59.016 Test: blockdev nvme passthru vendor specific ...passed
00:28:59.016 Test: blockdev nvme admin passthru ...passed
00:28:59.016 Test: blockdev copy ...passed
00:28:59.016 Suite: bdevio tests on: crypto_ram2
00:28:59.016 Test: blockdev write read block ...passed
00:28:59.016 Test: blockdev write zeroes read block ...passed
00:28:59.016 Test: blockdev write zeroes read no split ...passed
00:28:59.016 Test: blockdev write zeroes read split ...passed
00:28:59.016 Test: blockdev write zeroes read split partial ...passed
00:28:59.016 Test: blockdev reset ...passed
00:28:59.016 Test: blockdev write read 8 blocks ...passed
00:28:59.016 Test: blockdev write read size > 128k ...passed
00:28:59.016 Test: blockdev write read invalid size ...passed
00:28:59.016 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:28:59.016 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:28:59.016 Test: blockdev write read max offset ...passed
00:28:59.016 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:28:59.016 Test: blockdev writev readv 8 blocks ...passed
00:28:59.016 Test: blockdev writev readv 30 x 1block ...passed
00:28:59.016 Test: blockdev writev readv block ...passed
00:28:59.016 Test: blockdev writev readv size > 128k ...passed
00:28:59.016 Test: blockdev writev readv size > 128k in two iovs ...passed
00:28:59.016 Test: blockdev comparev and writev ...passed
00:28:59.016 Test: blockdev nvme passthru rw ...passed
00:28:59.016 Test: blockdev nvme passthru vendor specific ...passed
00:28:59.016 Test: blockdev nvme admin passthru ...passed
00:28:59.016 Test: blockdev copy ...passed
00:28:59.016 Suite: bdevio tests on: crypto_ram1
00:28:59.016 Test: blockdev write read block ...passed
00:28:59.016 Test: blockdev write zeroes read block ...passed
00:28:59.016 Test: blockdev write zeroes read no split ...passed
00:28:59.016 Test: blockdev write zeroes read split ...passed
00:28:59.016 Test: blockdev write zeroes read split partial ...passed
00:28:59.016 Test: blockdev reset ...passed
00:28:59.016 Test: blockdev write read 8 blocks ...passed
00:28:59.016 Test: blockdev write read size > 128k ...passed
00:28:59.016 Test: blockdev write read invalid size ...passed
00:28:59.016 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:28:59.016 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:28:59.016 Test: blockdev write read max offset ...passed
00:28:59.016 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:28:59.016 Test: blockdev writev readv 8 blocks ...passed
00:28:59.016 Test: blockdev writev readv 30 x 1block ...passed
00:28:59.016 Test: blockdev writev readv block ...passed
00:28:59.016 Test: blockdev writev readv size > 128k ...passed
00:28:59.016 Test: blockdev writev readv size > 128k in two iovs ...passed
00:28:59.016 Test: blockdev comparev and writev ...passed
00:28:59.016 Test: blockdev nvme passthru rw ...passed
00:28:59.016 Test: blockdev nvme passthru vendor specific ...passed
00:28:59.016 Test: blockdev nvme admin passthru ...passed
00:28:59.016 Test: blockdev copy ...passed
00:28:59.016 Suite: bdevio tests on: crypto_ram
00:28:59.016 Test: blockdev write read block ...passed
00:28:59.016 Test: blockdev write zeroes read block ...passed
00:28:59.016 Test: blockdev write zeroes read no split ...passed
00:28:59.016 Test: blockdev write zeroes read split ...passed
00:28:59.276 Test: blockdev write zeroes read split partial ...passed
00:28:59.276 Test: blockdev reset ...passed
00:28:59.276 Test: blockdev write read 8 blocks ...passed
00:28:59.276 Test: blockdev write read size > 128k ...passed
00:28:59.276 Test: blockdev write read invalid size ...passed
00:28:59.276 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:28:59.276 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:28:59.276 Test: blockdev write read max offset ...passed
00:28:59.277 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:28:59.277 Test: blockdev writev readv 8 blocks ...passed
00:28:59.277 Test: blockdev writev readv 30 x 1block ...passed
00:28:59.277 Test: blockdev writev readv block ...passed
00:28:59.277 Test: blockdev writev readv size > 128k ...passed
00:28:59.277 Test: blockdev writev readv size > 128k in two iovs ...passed
00:28:59.277 Test: blockdev comparev and writev ...passed
00:28:59.277 Test: blockdev nvme passthru rw ...passed
00:28:59.277 Test: blockdev nvme passthru vendor specific ...passed
00:28:59.277 Test: blockdev nvme admin passthru ...passed
00:28:59.277 Test: blockdev copy ...passed
00:28:59.277
00:28:59.277 Run Summary: Type Total Ran Passed Failed Inactive
00:28:59.277 suites 4 4 n/a 0 0
00:28:59.277 tests 92 92 92 0 0
00:28:59.277 asserts 520 520 520 0 n/a
00:28:59.277
00:28:59.277 Elapsed time = 0.474 seconds
00:28:59.277 0
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1739779
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 1739779 ']'
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 1739779
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # uname
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1739779
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1739779'
killing process with pid 1739779
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # kill 1739779
00:28:59.277 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@973 -- # wait 1739779
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT
00:28:59.537
00:28:59.537 real 0m3.246s
00:28:59.537 user 0m9.297s
00:28:59.537 sys 0m0.381s
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:28:59.537 ************************************
00:28:59.537 END TEST bdev_bounds
00:28:59.537 ************************************
00:28:59.537 13:57:13 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' ''
00:28:59.537 13:57:13 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']'
00:28:59.537 13:57:13 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:28:59.537 13:57:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:28:59.537 ************************************
00:28:59.537 START TEST bdev_nbd
00:28:59.537 ************************************
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' ''
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]]
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]]
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4
00:28:59.537 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1740597
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1740597 /var/tmp/spdk-nbd.sock
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 1740597 ']'
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable
00:28:59.538 13:57:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:28:59.538 [2024-06-10 13:57:13.959791] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:28:59.538 [2024-06-10 13:57:13.959834] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:59.797 [2024-06-10 13:57:14.047715] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:59.797 [2024-06-10 13:57:14.112673] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.797 [2024-06-10 13:57:14.133754] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:59.797 [2024-06-10 13:57:14.141780] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:59.797 [2024-06-10 13:57:14.149795] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:59.797 [2024-06-10 13:57:14.234490] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:02.336 [2024-06-10 13:57:16.367683] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:02.337 [2024-06-10 13:57:16.367744] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:02.337 [2024-06-10 13:57:16.367753] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:02.337 [2024-06-10 13:57:16.375701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:02.337 [2024-06-10 13:57:16.375713] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:02.337 [2024-06-10 13:57:16.375719] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:02.337 [2024-06-10 13:57:16.383721] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:02.337 [2024-06-10 
13:57:16.383731] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:02.337 [2024-06-10 13:57:16.383737] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:02.337 [2024-06-10 13:57:16.391742] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:02.337 [2024-06-10 13:57:16.391753] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:02.337 [2024-06-10 13:57:16.391759] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@24 -- # local i 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.337 1+0 records in 00:29:02.337 1+0 records out 00:29:02.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000637217 s, 6.4 MB/s 00:29:02.337 
13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:02.337 13:57:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.597 1+0 records in 00:29:02.597 1+0 records out 00:29:02.597 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259578 s, 15.8 MB/s 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:02.597 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # 
local i 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.856 1+0 records in 00:29:02.856 1+0 records out 00:29:02.856 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243145 s, 16.8 MB/s 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:02.856 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:29:03.116 
13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:03.116 1+0 records in 00:29:03.116 1+0 records out 00:29:03.116 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334991 s, 12.2 MB/s 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:03.116 
13:57:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:29:03.116 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:03.375 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:29:03.375 { 00:29:03.375 "nbd_device": "/dev/nbd0", 00:29:03.375 "bdev_name": "crypto_ram" 00:29:03.375 }, 00:29:03.375 { 00:29:03.375 "nbd_device": "/dev/nbd1", 00:29:03.375 "bdev_name": "crypto_ram1" 00:29:03.375 }, 00:29:03.375 { 00:29:03.375 "nbd_device": "/dev/nbd2", 00:29:03.375 "bdev_name": "crypto_ram2" 00:29:03.375 }, 00:29:03.375 { 00:29:03.375 "nbd_device": "/dev/nbd3", 00:29:03.375 "bdev_name": "crypto_ram3" 00:29:03.375 } 00:29:03.375 ]' 00:29:03.375 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:29:03.376 { 00:29:03.376 "nbd_device": "/dev/nbd0", 00:29:03.376 "bdev_name": "crypto_ram" 00:29:03.376 }, 00:29:03.376 { 00:29:03.376 "nbd_device": "/dev/nbd1", 00:29:03.376 "bdev_name": "crypto_ram1" 00:29:03.376 }, 00:29:03.376 { 00:29:03.376 "nbd_device": "/dev/nbd2", 00:29:03.376 "bdev_name": "crypto_ram2" 00:29:03.376 }, 00:29:03.376 { 00:29:03.376 "nbd_device": "/dev/nbd3", 00:29:03.376 "bdev_name": "crypto_ram3" 00:29:03.376 } 00:29:03.376 ]' 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:29:03.376 
13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.376 13:57:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.635 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
waitfornbd_exit nbd1 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.895 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:04.155 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:29:04.415 
13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:04.415 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:04.675 13:57:18 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:04.675 
13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:04.675 13:57:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:29:04.675 /dev/nbd0 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:04.675 1+0 records in 00:29:04.675 1+0 records out 00:29:04.675 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212684 s, 19.3 MB/s 00:29:04.675 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:04.935 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:29:04.936 /dev/nbd1 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:04.936 13:57:19 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:04.936 1+0 records in 00:29:04.936 1+0 records out 00:29:04.936 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274381 s, 14.9 MB/s 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:04.936 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:29:05.195 /dev/nbd10 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:05.195 13:57:19 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:05.195 1+0 records in 00:29:05.195 1+0 records out 00:29:05.195 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298079 s, 13.7 MB/s 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:05.195 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:29:05.455 /dev/nbd11 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:29:05.455 
13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:05.455 1+0 records in 00:29:05.455 1+0 records out 00:29:05.455 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271466 s, 15.1 MB/s 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:05.455 13:57:19 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:05.455 13:57:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:05.715 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd0", 00:29:05.715 "bdev_name": "crypto_ram" 00:29:05.715 }, 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd1", 00:29:05.715 "bdev_name": "crypto_ram1" 00:29:05.715 }, 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd10", 00:29:05.715 "bdev_name": "crypto_ram2" 00:29:05.715 }, 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd11", 00:29:05.715 "bdev_name": "crypto_ram3" 00:29:05.715 } 00:29:05.715 ]' 00:29:05.715 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd0", 00:29:05.715 "bdev_name": "crypto_ram" 00:29:05.715 }, 00:29:05.715 { 00:29:05.715 "nbd_device": "/dev/nbd1", 00:29:05.716 "bdev_name": "crypto_ram1" 00:29:05.716 }, 00:29:05.716 { 00:29:05.716 "nbd_device": "/dev/nbd10", 00:29:05.716 "bdev_name": "crypto_ram2" 00:29:05.716 }, 00:29:05.716 { 00:29:05.716 "nbd_device": "/dev/nbd11", 00:29:05.716 "bdev_name": "crypto_ram3" 00:29:05.716 } 00:29:05.716 ]' 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:05.716 /dev/nbd1 00:29:05.716 /dev/nbd10 00:29:05.716 /dev/nbd11' 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:05.716 /dev/nbd1 00:29:05.716 /dev/nbd10 00:29:05.716 /dev/nbd11' 00:29:05.716 13:57:20 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:05.716 256+0 records in 00:29:05.716 256+0 records out 00:29:05.716 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116714 s, 89.8 MB/s 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:05.716 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:05.975 256+0 records in 00:29:05.975 256+0 records out 00:29:05.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0683529 s, 15.3 MB/s 00:29:05.975 13:57:20 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:05.975 256+0 records in 00:29:05.975 256+0 records out 00:29:05.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0448029 s, 23.4 MB/s 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:29:05.975 256+0 records in 00:29:05.975 256+0 records out 00:29:05.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0330803 s, 31.7 MB/s 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:29:05.975 256+0 records in 00:29:05.975 256+0 records out 00:29:05.975 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0386155 s, 27.2 MB/s 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:05.975 13:57:20 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:29:05.975 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:05.976 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:06.235 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.495 13:57:20 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:06.495 13:57:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:29:06.754 13:57:21 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:06.754 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:07.013 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # 
return 0 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:07.014 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:07.273 malloc_lvol_verify 00:29:07.273 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:07.533 30adb499-2374-41dd-aa2d-dc5182eb9a8c 00:29:07.533 13:57:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:07.792 c00bbab5-0723-4223-8b25-ce1518823537 00:29:07.792 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:08.051 /dev/nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:08.051 mke2fs 1.46.5 (30-Dec-2021) 00:29:08.051 Discarding device blocks: 0/4096 done 00:29:08.051 Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:08.051 00:29:08.051 Allocating group tables: 0/1 done 00:29:08.051 Writing inode tables: 0/1 
done 00:29:08.051 Creating journal (1024 blocks): done 00:29:08.051 Writing superblocks and filesystem accounting information: 0/1 done 00:29:08.051 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:08.051 13:57:22 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1740597 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 1740597 ']' 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 1740597 00:29:08.051 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1740597 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1740597' 00:29:08.310 killing process with pid 1740597 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # kill 1740597 00:29:08.310 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@973 -- # wait 1740597 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:29:08.570 00:29:08.570 real 0m8.900s 00:29:08.570 user 0m12.429s 00:29:08.570 sys 0m2.482s 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:08.570 ************************************ 00:29:08.570 END TEST bdev_nbd 00:29:08.570 ************************************ 00:29:08.570 13:57:22 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:08.570 13:57:22 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- 
# '[' crypto_qat = nvme ']' 00:29:08.570 13:57:22 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:29:08.570 13:57:22 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:08.570 13:57:22 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:29:08.570 13:57:22 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:08.570 13:57:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:08.570 ************************************ 00:29:08.570 START TEST bdev_fio 00:29:08.570 ************************************ 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:08.570 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 
00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:29:08.570 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:08.571 13:57:22 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 
00:29:08.571 ************************************ 00:29:08.571 START TEST bdev_fio_rw_verify 00:29:08.571 ************************************ 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:29:08.571 13:57:22 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:29:08.571 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:08.849 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:08.849 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:08.849 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:08.849 13:57:23 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:09.109 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:09.109 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:09.109 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:09.109 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:09.109 fio-3.35 00:29:09.109 Starting 4 threads 00:29:24.087 00:29:24.087 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1743259: Mon Jun 10 13:57:36 2024 00:29:24.087 read: IOPS=32.7k, BW=128MiB/s (134MB/s)(1278MiB/10001msec) 00:29:24.087 slat (usec): min=15, max=884, avg=39.66, stdev=25.47 00:29:24.087 clat (usec): min=21, max=1821, avg=239.60, stdev=166.24 00:29:24.087 lat (usec): min=39, max=2041, avg=279.26, stdev=180.37 00:29:24.087 clat percentiles (usec): 00:29:24.087 | 50.000th=[ 182], 99.000th=[ 816], 99.900th=[ 971], 99.990th=[ 1172], 00:29:24.087 | 99.999th=[ 1647] 00:29:24.087 write: IOPS=35.8k, BW=140MiB/s (147MB/s)(1364MiB/9740msec); 0 zone resets 00:29:24.087 slat (usec): min=15, max=556, avg=50.33, stdev=25.05 00:29:24.087 clat (usec): min=16, max=1897, avg=273.87, stdev=168.58 00:29:24.087 lat (usec): min=46, max=2122, avg=324.20, stdev=182.43 00:29:24.087 clat percentiles (usec): 00:29:24.087 | 50.000th=[ 231], 99.000th=[ 824], 99.900th=[ 996], 99.990th=[ 1221], 00:29:24.087 | 99.999th=[ 1663] 00:29:24.087 bw ( KiB/s): min=123816, max=163952, per=97.80%, avg=140200.00, stdev=2656.77, samples=76 00:29:24.087 iops : min=30954, max=40990, avg=35050.00, 
stdev=664.28, samples=76 00:29:24.087 lat (usec) : 20=0.01%, 50=0.02%, 100=12.10%, 250=48.34%, 500=30.24% 00:29:24.087 lat (usec) : 750=7.33%, 1000=1.89% 00:29:24.087 lat (msec) : 2=0.08% 00:29:24.087 cpu : usr=99.72%, sys=0.01%, ctx=43, majf=0, minf=240 00:29:24.087 IO depths : 1=0.2%, 2=28.6%, 4=57.0%, 8=14.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:24.087 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:24.087 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:24.087 issued rwts: total=327126,349070,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:24.087 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:24.087 00:29:24.087 Run status group 0 (all jobs): 00:29:24.087 READ: bw=128MiB/s (134MB/s), 128MiB/s-128MiB/s (134MB/s-134MB/s), io=1278MiB (1340MB), run=10001-10001msec 00:29:24.087 WRITE: bw=140MiB/s (147MB/s), 140MiB/s-140MiB/s (147MB/s-147MB/s), io=1364MiB (1430MB), run=9740-9740msec 00:29:24.087 00:29:24.087 real 0m13.293s 00:29:24.087 user 0m51.379s 00:29:24.087 sys 0m0.423s 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:24.087 ************************************ 00:29:24.087 END TEST bdev_fio_rw_verify 00:29:24.087 ************************************ 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:29:24.087 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9e21ed51-3386-51b8-be0d-2d5dfa69b307"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e21ed51-3386-51b8-be0d-2d5dfa69b307",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 
8192,' ' "uuid": "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c803179c-c93c-587a-9852-1efcdadfe26b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c803179c-c93c-587a-9852-1efcdadfe26b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:24.088 13:57:36 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:24.088 crypto_ram1 00:29:24.088 crypto_ram2 00:29:24.088 crypto_ram3 ]] 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c83d6f04-c65f-54cb-9f83-a1f117cd8ecf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9e21ed51-3386-51b8-be0d-2d5dfa69b307"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9e21ed51-3386-51b8-be0d-2d5dfa69b307",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' 
' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3f3f2ac6-7dce-585c-b1e7-8b2b28ae79da",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "c803179c-c93c-587a-9852-1efcdadfe26b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c803179c-c93c-587a-9852-1efcdadfe26b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:24.088 13:57:36 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:24.088 ************************************ 00:29:24.088 START TEST bdev_fio_trim 00:29:24.088 ************************************ 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:29:24.088 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:29:24.089 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:29:24.089 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:24.089 13:57:36 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:24.089 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:24.089 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:24.089 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:24.089 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:24.089 fio-3.35 00:29:24.089 Starting 4 threads 00:29:36.316 00:29:36.316 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1745909: Mon Jun 10 13:57:49 2024 00:29:36.316 write: IOPS=60.6k, BW=237MiB/s (248MB/s)(2366MiB/10001msec); 0 zone resets 00:29:36.316 slat (usec): min=10, max=803, avg=41.41, stdev=32.15 00:29:36.316 clat (usec): min=17, max=1018, avg=139.11, stdev=97.02 00:29:36.316 lat (usec): min=37, 
max=1037, avg=180.52, stdev=116.85 00:29:36.316 clat percentiles (usec): 00:29:36.316 | 50.000th=[ 114], 99.000th=[ 494], 99.900th=[ 619], 99.990th=[ 676], 00:29:36.316 | 99.999th=[ 725] 00:29:36.316 bw ( KiB/s): min=235840, max=245664, per=100.00%, avg=242246.32, stdev=632.35, samples=76 00:29:36.316 iops : min=58960, max=61416, avg=60561.68, stdev=158.08, samples=76 00:29:36.316 trim: IOPS=60.6k, BW=237MiB/s (248MB/s)(2366MiB/10001msec); 0 zone resets 00:29:36.316 slat (usec): min=3, max=158, avg= 8.19, stdev= 4.22 00:29:36.316 clat (usec): min=4, max=1037, avg=180.68, stdev=116.86 00:29:36.316 lat (usec): min=14, max=1044, avg=188.87, stdev=118.09 00:29:36.316 clat percentiles (usec): 00:29:36.316 | 50.000th=[ 147], 99.000th=[ 594], 99.900th=[ 742], 99.990th=[ 807], 00:29:36.316 | 99.999th=[ 881] 00:29:36.316 bw ( KiB/s): min=235840, max=245664, per=100.00%, avg=242246.74, stdev=632.33, samples=76 00:29:36.316 iops : min=58960, max=61416, avg=60561.68, stdev=158.08, samples=76 00:29:36.316 lat (usec) : 10=0.01%, 20=0.01%, 50=5.86%, 100=26.77%, 250=51.96% 00:29:36.316 lat (usec) : 500=13.57%, 750=1.79%, 1000=0.04% 00:29:36.316 lat (msec) : 2=0.01% 00:29:36.316 cpu : usr=99.70%, sys=0.01%, ctx=142, majf=0, minf=87 00:29:36.316 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:36.316 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:36.316 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:36.316 issued rwts: total=0,605693,605694,0 short=0,0,0,0 dropped=0,0,0,0 00:29:36.316 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:36.316 00:29:36.316 Run status group 0 (all jobs): 00:29:36.316 WRITE: bw=237MiB/s (248MB/s), 237MiB/s-237MiB/s (248MB/s-248MB/s), io=2366MiB (2481MB), run=10001-10001msec 00:29:36.316 TRIM: bw=237MiB/s (248MB/s), 237MiB/s-237MiB/s (248MB/s-248MB/s), io=2366MiB (2481MB), run=10001-10001msec 00:29:36.316 00:29:36.316 real 0m13.395s 00:29:36.316 
user 0m53.532s 00:29:36.316 sys 0m0.487s 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:36.316 ************************************ 00:29:36.316 END TEST bdev_fio_trim 00:29:36.316 ************************************ 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:36.316 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:36.316 00:29:36.316 real 0m27.027s 00:29:36.316 user 1m45.090s 00:29:36.316 sys 0m1.086s 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:36.316 13:57:49 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:36.316 ************************************ 00:29:36.316 END TEST bdev_fio 00:29:36.316 ************************************ 00:29:36.316 13:57:49 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:36.316 13:57:49 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:36.316 13:57:49 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:29:36.316 13:57:49 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:36.317 13:57:49 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 
00:29:36.317 ************************************ 00:29:36.317 START TEST bdev_verify 00:29:36.317 ************************************ 00:29:36.317 13:57:49 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:36.317 [2024-06-10 13:57:50.044832] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:29:36.317 [2024-06-10 13:57:50.044894] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1747892 ] 00:29:36.317 [2024-06-10 13:57:50.139672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:36.317 [2024-06-10 13:57:50.218848] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:29:36.317 [2024-06-10 13:57:50.218853] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:29:36.317 [2024-06-10 13:57:50.239998] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:36.317 [2024-06-10 13:57:50.248030] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:36.317 [2024-06-10 13:57:50.256045] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:36.317 [2024-06-10 13:57:50.348373] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:38.232 [2024-06-10 13:57:52.483218] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:38.232 [2024-06-10 13:57:52.483279] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:38.232 [2024-06-10 13:57:52.483287] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:38.232 [2024-06-10 13:57:52.491236] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:38.232 [2024-06-10 13:57:52.491248] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:38.232 [2024-06-10 13:57:52.491255] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:38.232 [2024-06-10 13:57:52.499256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:38.232 [2024-06-10 13:57:52.499268] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:38.232 [2024-06-10 13:57:52.499273] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:38.232 [2024-06-10 13:57:52.507278] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:38.232 [2024-06-10 13:57:52.507289] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:38.232 [2024-06-10 13:57:52.507294] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:38.232 Running I/O for 5 seconds... 
00:29:43.516 00:29:43.516 Latency(us) 00:29:43.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:43.516 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x0 length 0x1000 00:29:43.516 crypto_ram : 5.07 606.04 2.37 0.00 0.00 210938.02 3399.68 152917.33 00:29:43.516 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x1000 length 0x1000 00:29:43.516 crypto_ram : 5.06 606.60 2.37 0.00 0.00 210747.33 4314.45 152043.52 00:29:43.516 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x0 length 0x1000 00:29:43.516 crypto_ram1 : 5.07 605.92 2.37 0.00 0.00 210458.55 3713.71 138062.51 00:29:43.516 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x1000 length 0x1000 00:29:43.516 crypto_ram1 : 5.07 606.48 2.37 0.00 0.00 210261.93 4696.75 137188.69 00:29:43.516 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x0 length 0x1000 00:29:43.516 crypto_ram2 : 5.05 4712.56 18.41 0.00 0.00 26952.75 5215.57 23811.41 00:29:43.516 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x1000 length 0x1000 00:29:43.516 crypto_ram2 : 5.05 4739.57 18.51 0.00 0.00 26798.53 4942.51 23046.83 00:29:43.516 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x0 length 0x1000 00:29:43.516 crypto_ram3 : 5.06 4708.20 18.39 0.00 0.00 26909.50 5543.25 23265.28 00:29:43.516 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:43.516 Verification LBA range: start 0x1000 length 0x1000 00:29:43.516 crypto_ram3 : 5.05 4737.98 18.51 0.00 0.00 26748.42 5133.65 
23592.96 00:29:43.516 =================================================================================================================== 00:29:43.516 Total : 21323.35 83.29 0.00 0.00 47802.14 3399.68 152917.33 00:29:43.516 00:29:43.516 real 0m7.917s 00:29:43.516 user 0m15.227s 00:29:43.516 sys 0m0.239s 00:29:43.516 13:57:57 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:43.516 13:57:57 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:43.516 ************************************ 00:29:43.516 END TEST bdev_verify 00:29:43.516 ************************************ 00:29:43.516 13:57:57 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:43.516 13:57:57 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:29:43.516 13:57:57 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:43.516 13:57:57 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:43.516 ************************************ 00:29:43.516 START TEST bdev_verify_big_io 00:29:43.516 ************************************ 00:29:43.516 13:57:57 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:43.776 [2024-06-10 13:57:58.046270] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:29:43.776 [2024-06-10 13:57:58.046323] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1749406 ] 00:29:43.776 [2024-06-10 13:57:58.137718] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:43.776 [2024-06-10 13:57:58.216040] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:29:43.776 [2024-06-10 13:57:58.216046] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:29:43.776 [2024-06-10 13:57:58.237198] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:43.776 [2024-06-10 13:57:58.245227] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:44.036 [2024-06-10 13:57:58.253243] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:44.036 [2024-06-10 13:57:58.340645] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:46.577 [2024-06-10 13:58:00.479463] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:46.577 [2024-06-10 13:58:00.479515] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:46.577 [2024-06-10 13:58:00.479524] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:46.577 [2024-06-10 13:58:00.487479] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:46.577 [2024-06-10 13:58:00.487492] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:46.577 [2024-06-10 13:58:00.487498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:46.577 [2024-06-10 
13:58:00.495501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:46.577 [2024-06-10 13:58:00.495512] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:46.577 [2024-06-10 13:58:00.495518] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:46.577 [2024-06-10 13:58:00.503522] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:46.577 [2024-06-10 13:58:00.503533] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:46.577 [2024-06-10 13:58:00.503539] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:46.577 Running I/O for 5 seconds... 00:29:47.150 [2024-06-10 13:58:01.332268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.332632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.332687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.332723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.332757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.332789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.333122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.150 [2024-06-10 13:58:01.333131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! repeated for each subsequent I/O submission, timestamps 2024-06-10 13:58:01.336643 through 13:58:01.430779 (log time 00:29:47.150-00:29:47.153) ...]
00:29:47.153 [2024-06-10 13:58:01.430811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.431190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.431200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.433546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.433580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.433613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.433645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.153 [2024-06-10 13:58:01.434531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.436240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.436810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.437069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.437077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.438884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.438918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.438950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.438981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.439773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.441887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.441919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.442186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.442195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.444986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.445024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.445359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.445369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.446944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.446979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.447647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.450262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.450589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.451341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.452633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.454418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.455783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.457027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.458305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.458545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.458554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.461273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.461597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.463172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.464789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.466568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.467263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.468558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.470118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.470364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.470374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.472951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.474452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.475829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.477390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.478302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.479649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.481211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.482769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.154 [2024-06-10 13:58:01.483010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.483019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.154 [2024-06-10 13:58:01.486145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.487438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.488996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.490558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.492547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.494130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.495773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.497265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.497624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.497633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.500937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.155 [2024-06-10 13:58:01.502462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.504026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.504802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.506317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.507856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.509400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.509722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.510091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.510101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.513456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.515020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.516276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.517659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.155 [2024-06-10 13:58:01.519536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.521109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.521696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.522018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.522367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.522376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.526001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.527559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.528614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.529914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.531718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.532740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.533064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.155 [2024-06-10 13:58:01.533387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.533757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.533765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.537170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.537819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.539140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.540698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.542548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.542872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.543195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.543521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.543976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.543988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.155 [2024-06-10 13:58:01.546651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.548171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.549567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.551124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.551909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.552236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.552557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.552895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.553244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.553253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.555571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.556867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.155 [2024-06-10 13:58:01.558426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.156 [2024-06-10 13:58:01.559999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.560731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.561054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.561377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.561698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.561990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.561999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.565031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.566656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.568228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.569532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.570209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.570532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.156 [2024-06-10 13:58:01.570853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.571257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.571495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.571508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.574265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.575825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.577396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.577751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.578510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.578832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.579153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.580574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.156 [2024-06-10 13:58:01.580880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.424 [2024-06-10 13:58:01.790103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.790629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.791028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.791037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.792761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.792795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.792827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.792858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.424 [2024-06-10 13:58:01.793214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.793695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.795700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.795733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.795768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.795816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.424 [2024-06-10 13:58:01.796228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.796646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.799973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.800004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.424 [2024-06-10 13:58:01.800036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.800399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.800408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.802833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.803175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.424 [2024-06-10 13:58:01.803184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.805167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.805201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.424 [2024-06-10 13:58:01.805232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.805740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.806062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.806071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.808107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.808995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.810699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.810733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.810765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.810796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.811778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.814753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.814849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.815189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.815199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.817686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.817717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.818032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.818041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.820933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.820942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.822855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.822888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.822920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.822950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.823763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.825879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.825912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.825945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.825976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.826757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.425 [2024-06-10 13:58:01.829233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.425 [2024-06-10 13:58:01.829756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.829787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.830416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.830425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.832539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.832572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.832614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.832656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.426 [2024-06-10 13:58:01.832996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.833560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.835960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.835993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.836025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.836056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.836365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.426 [2024-06-10 13:58:01.836404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.426 [2024-06-10 13:58:01.836436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.692 [... previous message repeated ~270 times, timestamps 13:58:01.836467 through 13:58:02.006291 ...]
00:29:47.692 [2024-06-10 13:58:02.006652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.007943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.009489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.011049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.011723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.012080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.012090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.015021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.016571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.018130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.018883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.019172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.020742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.692 [2024-06-10 13:58:02.022294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.023587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.023921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.024259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.024268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.027350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.028913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.029597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.030942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.031191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.032771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.034401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.034723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.692 [2024-06-10 13:58:02.035047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.035351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.035360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.038406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.039526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.041057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.042452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.042693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.044262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.044783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.045104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.045428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.045846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.692 [2024-06-10 13:58:02.045858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.048990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.692 [2024-06-10 13:58:02.049908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.051200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.052728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.052968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.054136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.054464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.054789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.055110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.055461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.055470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.057593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.059105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.060725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.062291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.062531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.062857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.063182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.063505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.063826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.064138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.064148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.066635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.067911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.069467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.071031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.071531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.071871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.072197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.072525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.072850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.073088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.073097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.075763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.077327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.078878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.079898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.080277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.080602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.080925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.081249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.082510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.082849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.082861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.085642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.087198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.088821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.089143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.089519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.089847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.090173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.090822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.092099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.092343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.092353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.095291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.096856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.097520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.097843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.098192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.098520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.098842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.100421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.101889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.102127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.102137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.105083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.106376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.106699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.107020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.107391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.107718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.108675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.109961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.111527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.111766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.111774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.114761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.115098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.115424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.115744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.116146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.116474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.118012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.119635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.121114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.121429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.121438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.123175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.123500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.693 [2024-06-10 13:58:02.123823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.124151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.124484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.124809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.693 [2024-06-10 13:58:02.125131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.125456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.125778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.126119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.126129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.128383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.128723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.129049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.129374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.694 [2024-06-10 13:58:02.129751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.130080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.130407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.130729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.131051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.131331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.131339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.134708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.135036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.135375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.135697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.136096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.136425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.694 [2024-06-10 13:58:02.136748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.137070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.137398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.137764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.137773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.140388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.140712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.141034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.141381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.141696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.142020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.142347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.694 [2024-06-10 13:58:02.142671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.694 [2024-06-10 13:58:02.142993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.961 [... previous message repeated continuously; identical errors logged from 13:58:02.142993 through 13:58:02.223740 ...] 
00:29:47.961 [2024-06-10 13:58:02.223771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.223802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.223833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.224113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.224122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.225578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.225612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.225644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.225679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.961 [2024-06-10 13:58:02.226178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.226581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.228823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.961 [2024-06-10 13:58:02.228831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.230864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.231183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.231193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.232678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.961 [2024-06-10 13:58:02.232718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.961 [2024-06-10 13:58:02.232749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.232780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.233488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.234845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.234879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.234910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.234941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.235754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.237905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.237973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.238005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.238248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.238257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.239706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.239739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.239771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.239801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.240292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.240658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.242995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.243235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.243244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.244715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.244749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.246816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.247146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.247155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.248753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.248787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.248818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.250692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.251122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.251131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.252820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.253145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.253470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.253793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.254110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.255393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.256946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.258504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.259611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.259856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.259865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.261588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.261913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.262253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.262575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.962 [2024-06-10 13:58:02.262816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.264110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.265664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.267219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.962 [2024-06-10 13:58:02.268043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.268314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.268324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.270142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.270471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.270793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.272083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.272357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.273919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.963 [2024-06-10 13:58:02.275484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.276240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.277736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.277977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.277986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.279941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.280269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.280971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.282251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.282490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.284058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.285445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.286672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.963 [2024-06-10 13:58:02.287958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.288201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.288209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.291057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.291386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.292974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.294586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.294826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.296400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.297175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.298458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.300022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.963 [2024-06-10 13:58:02.300266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.963 [2024-06-10 13:58:02.300280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously between 13:58:02.300 and 13:58:02.524; duplicate entries elided ...]
00:29:48.230 [2024-06-10 13:58:02.524070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:48.231 [2024-06-10 13:58:02.524394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.524714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.524724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.527249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.527576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.527609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.527931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.528260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.528584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.528907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.529233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.529556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.529913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.529922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.532241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.532568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.532894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.532926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.533393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.533718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.534056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.534381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.534706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.535112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.535124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.537295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.537905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.538374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.538384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.540816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.540853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.540886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.540917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.541778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.543775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.543810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.543842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.543873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.544249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.544812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.546668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.546703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.546734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.546768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.547217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.547595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.549567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.549604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.549636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.549667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.231 [2024-06-10 13:58:02.550565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.550574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.552909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.552943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.552974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.231 [2024-06-10 13:58:02.553022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.553940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.556469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.556503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.556534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.556565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.556944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.556979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.557011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.557042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.557074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.557436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.557447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.559583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.559617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.559652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.559684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.560592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.562751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.562896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.563262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.563271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.565651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.565713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.566079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.566088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.567726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.567760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.567791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.567837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.568431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.568778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.571822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.572120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.572129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.574977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.232 [2024-06-10 13:58:02.576337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.232 [2024-06-10 13:58:02.576371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.236 [... identical "Failed to get src_mbufs!" error repeated from 13:58:02.576402 through 13:58:02.700232 ...]
00:29:48.236 [2024-06-10 13:58:02.700242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.703251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.703971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.705255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.706813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.707054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.708370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.708693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.709013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.709337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.709675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.709683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.711987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.500 [2024-06-10 13:58:02.713614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.715223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.716806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.717046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.717376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.717699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.718020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.718346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.718685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.718694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.721107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.722387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.723946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.500 [2024-06-10 13:58:02.725506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.725846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.726174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.726497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.726817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.727137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.727384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.727393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.730246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.731793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.733348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.734570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.734943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.500 [2024-06-10 13:58:02.735270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.735593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.735918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.736833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.737086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.737095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.739786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.741347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.742916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.743241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.743615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.743941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.744266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.500 [2024-06-10 13:58:02.744588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.746039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.746281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.746291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.749314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.750898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.751679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.752018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.752370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.752695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.753020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.754349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.755639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.500 [2024-06-10 13:58:02.755878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.755887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.758867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.760240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.760563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.760885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.761190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.761523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.762285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.763578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.765128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.500 [2024-06-10 13:58:02.765391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.765402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.768363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.768688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.769011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.769334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.769727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.770052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.771061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.772334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.773904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.774143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.774153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.777150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.777479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.777801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.778123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.778453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.779083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.780350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.781913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.783472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.783730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.783739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.785637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.785964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.786292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.786616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.786984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.788608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.790113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.791740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.793277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.793639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.793647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.795363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.795686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.796007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.796330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.796593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.797889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.799418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.800979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.801646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.801884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.801893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.803628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.803954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.804278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.804927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.805206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.806828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.808379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.809779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.811007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.811304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.811313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.813293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.813626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.813947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.815577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.815837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.817396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.818959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.819632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.820916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.821155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.821167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.823455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.823785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.824469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.826022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.826265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.827532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.828504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.830064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.831626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.831990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.831999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.836002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.837636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.839188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.840549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.840928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.842219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.843794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.845349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.846483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.846925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.846934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.849713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.850038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.501 [2024-06-10 13:58:02.850372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.501 [2024-06-10 13:58:02.850695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* message repeated continuously from 13:58:02.850695 through 13:58:02.955849 ...]
00:29:48.505 [2024-06-10 13:58:02.955849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:48.505 [2024-06-10 13:58:02.955883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.955914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.955944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.956559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.505 [2024-06-10 13:58:02.958115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.958869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.505 [2024-06-10 13:58:02.962666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.962765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.963127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.963136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.967986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.505 [2024-06-10 13:58:02.968017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.968048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.968390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.505 [2024-06-10 13:58:02.968400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.972979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.973011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.768 [2024-06-10 13:58:02.973270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.973280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.975835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.975872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.975906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.975937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.976553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.768 [2024-06-10 13:58:02.980285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.980867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.768 [2024-06-10 13:58:02.981322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.981331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.984681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.984717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:02.984748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.984779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.985399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.989901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.989944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.989975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:02.990254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.990626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.995995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:02.999284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.999320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.999351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:02.999382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.004315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.009692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.013527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.065924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.065986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.066289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.075914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.077450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.077490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:03.078123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.078164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.078467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.078783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.078792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.082492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.083965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.085110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.086399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.088198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.089155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.089479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.089800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:03.090216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.090225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.093442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.094125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.095407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.096971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.098698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.099021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.099344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.099665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.100011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.100019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.102397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:03.104019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.105581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.107231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.107890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.108216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.108537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.108862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.109188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.109197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.111636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.112940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.114474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.116031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.769 [2024-06-10 13:58:03.116692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.117015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.117338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.117661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.117897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.117906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.120646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.122207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.123774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.769 [2024-06-10 13:58:03.124822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.770 [2024-06-10 13:58:03.125523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.770 [2024-06-10 13:58:03.125846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:48.770 [2024-06-10 13:58:03.126170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:48.770 [2024-06-10 13:58:03.127332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... same *ERROR* message repeated continuously from 13:58:03.127 through 13:58:03.313, duplicates elided ...] 
00:29:49.035 [2024-06-10 13:58:03.313181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.035 [2024-06-10 13:58:03.314697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.314731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.314762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.314792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.315619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.035 [2024-06-10 13:58:03.317496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.317857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.318093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.318102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.035 [2024-06-10 13:58:03.319956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.319987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.320407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.320417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.322999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.323717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.035 [2024-06-10 13:58:03.323725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.325886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.035 [2024-06-10 13:58:03.328262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.328326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.328970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.330859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.330922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.331157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.331169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.333748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.334066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.334074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.335929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.336169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.336178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.338211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.338978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.340827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.340930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.341170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.341179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.343886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.343917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.344214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.344223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.345587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.345620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.345651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.345682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.454755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.456038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.456079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.457615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.459877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.461395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.462778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.464333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.464609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.465266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.465306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.466574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.468138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.469697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.469993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.470002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.474127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.475418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.036 [2024-06-10 13:58:03.476951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.036 [2024-06-10 13:58:03.478495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.480373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.482009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.483559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.484952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.485283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.485292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.488174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.489743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.491293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.492041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.493556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.495120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.037 [2024-06-10 13:58:03.496663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.496985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.497396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.497404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.500814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.502381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.503635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.505008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.037 [2024-06-10 13:58:03.506821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.301 [2024-06-10 13:58:03.508358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.301 [2024-06-10 13:58:03.509070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.301 [2024-06-10 13:58:03.509395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.301 [2024-06-10 13:58:03.509723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.301 [2024-06-10 13:58:03.509732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.301-00:29:49.304 [identical *ERROR* line repeated for timestamps 13:58:03.513465 through 13:58:03.667332; duplicate entries omitted] 
00:29:49.304 [2024-06-10 13:58:03.667363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.667396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.669199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.669233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.670792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.670825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.671250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.671260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.672594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.672628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.672659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.672690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.674012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.304 [2024-06-10 13:58:03.674047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.674078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.674109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.674469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.674478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.304 [2024-06-10 13:58:03.676718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.676727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.678977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.682614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.304 [2024-06-10 13:58:03.682647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.682678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.682711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.683390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.684797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.684831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.684862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.684893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.304 [2024-06-10 13:58:03.685153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.685190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.685221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.685254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.685490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.685499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.687745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.304 [2024-06-10 13:58:03.687777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.688137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.688145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.689501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.689535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.689566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.689597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.690019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.690052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.690082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.304 [2024-06-10 13:58:03.690114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.690383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.690392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.691849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.691884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.691916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.691950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.692787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.694337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.694943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.696787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.696818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.697189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.697198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.698741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.698775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.698806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.698837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.699188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.699222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.699253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.699284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.699518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.699527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.700936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.700970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.701674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.704494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.704528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.704560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.704591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.704876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.706332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.706389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.707937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.708181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.708190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.709587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.709620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.709652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.709683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.710072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.305 [2024-06-10 13:58:03.711676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.714499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.714535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.714566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.714597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.714907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.716465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.720477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.720782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.305 [2024-06-10 13:58:03.730717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.735964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.736010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.737536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.306 [2024-06-10 13:58:03.741430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.741474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.743011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.743043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.744716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.744750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.745404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.745438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.745703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.745744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.746045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.746076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.306 [2024-06-10 13:58:03.746108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.306 [2024-06-10 13:58:03.746445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.571 [... same *ERROR* line repeated continuously through 2024-06-10 13:58:03.987383; duplicate occurrences omitted ...]
00:29:49.572 [2024-06-10 13:58:03.987710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.989351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.989674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.990017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.991474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.991511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.991832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.991866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.992223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.992232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.994568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.994604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.994925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:03.995252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.995596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.997139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.997177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.997498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.997818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.998057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:03.998065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.001053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.001383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.001424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.001744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.002136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.002176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.003694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.004016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.004340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.004579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.004593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.007394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.007429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.007749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.008088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.008428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.008464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.008785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.009106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.010576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.010972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.010980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.013312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.013656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.013977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.014023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.014482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.014518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.014840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.015165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.015486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.015728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.015737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.018320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.018645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.018678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.019000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.019457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.019496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.019816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.020138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.020462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.020797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.020806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.023262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.023300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.023620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.023942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.024267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.024302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.024624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.024947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.025272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.025565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.025575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.027545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.028775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.029310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.029343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.029688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.029724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.030045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.030371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.030405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.030810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.030818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.034299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.034626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.034658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.572 [2024-06-10 13:58:04.036089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.572 [2024-06-10 13:58:04.036517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.036842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.037167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.037200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.037520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.037841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.037851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.039822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.039858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.041128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.041595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.573 [2024-06-10 13:58:04.041935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.042991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.836 [2024-06-10 13:58:04.043026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.043581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.043613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.043949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.043958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.048256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.048294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.048615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.048937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.049325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.050702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.050736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.051056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.836 [2024-06-10 13:58:04.051088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.051402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.051413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.053114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.053443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.055015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.055049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.055501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.055826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.055861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.057342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.057375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.057788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.836 [2024-06-10 13:58:04.057797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.062897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.062935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.063259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.064181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.064433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.065999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.066033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.067578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.067611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.067914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.067924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.069282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.836 [2024-06-10 13:58:04.069317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.069958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.069991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.070237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.836 [2024-06-10 13:58:04.070562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.070597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.071264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.071297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.071550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.071568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.074864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.074901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.074941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.837 [2024-06-10 13:58:04.074973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.075214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.076620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.076654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.078269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.078302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.078540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.078548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.080452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.080486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.080517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.080548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.837 [2024-06-10 13:58:04.080913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.837 [2024-06-10 13:58:04.081240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:49.840 [2024-06-10 13:58:04.203466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.203769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.203804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.203835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.205127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.205160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.205403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.205411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.209737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.840 [2024-06-10 13:58:04.210604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.210959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.211800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.212077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.212086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.215421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.217001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.217035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.217067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.217309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.217345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.218922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.840 [2024-06-10 13:58:04.218979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.220447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.220809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.220817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.226256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.226297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.226332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.227896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.228138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.228178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.228842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.228875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.230169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.840 [2024-06-10 13:58:04.230408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.230418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.234668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.235403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.235436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.235468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.235821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.235856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.237262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.237295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.238712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.238951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.238960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.840 [2024-06-10 13:58:04.244406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.245205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.245239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.245560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.245821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.245856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.246578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.246611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.246932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.247188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.247201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.252090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.253579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.840 [2024-06-10 13:58:04.254580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.255341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.255683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.840 [2024-06-10 13:58:04.255719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.256664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.256697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.257350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.257694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.257703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.263530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.265120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.266559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.267606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.841 [2024-06-10 13:58:04.267883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.267919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.268243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.268278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.269420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.269761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.269769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.274629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.276252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.277799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.279194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.279485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.279520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.841 [2024-06-10 13:58:04.280462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.280496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.280820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.841 [2024-06-10 13:58:04.281080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.281089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.285549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.287186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.288790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.290390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.290630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.290667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.291354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.291387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.842 [2024-06-10 13:58:04.292301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.292608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.292618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.297793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.298653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.300240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.301861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.302100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.302137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.303747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.304471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.305491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:49.842 [2024-06-10 13:58:04.305888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:49.842 [2024-06-10 13:58:04.305897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.310439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.311507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.313057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.314494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.314735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.316297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.316844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.318198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.318519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.318840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.318849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.324453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.105 [2024-06-10 13:58:04.325727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.327013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.328563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.328804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.329590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.331229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.331551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.331872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.332111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.332120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.336103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.337382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.338928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.105 [2024-06-10 13:58:04.340457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.340766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.341915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.342509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.342832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.344176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.344501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.344510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.349431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.351018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.351721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.352744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.353098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.105 [2024-06-10 13:58:04.353610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.354838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.355160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.356009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.356299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.356308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.362115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.362718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.364230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.105 [2024-06-10 13:58:04.364555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.364917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.366536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.366860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.106 [2024-06-10 13:58:04.367184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.368623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.368865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.368874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.374174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.375613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.375936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.376260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.376505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.377133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.377459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.378897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.106 [2024-06-10 13:58:04.380199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.108 [2024-06-10 13:58:04.567089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.108 [2024-06-10 13:58:04.567123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.108 [2024-06-10 13:58:04.567976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.568008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.568248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.568257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.571797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.571833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.571867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.571898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.572152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.573153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.573190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.109 [2024-06-10 13:58:04.573513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.573545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.573807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.573815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.577207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.577253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.577285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.577315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.109 [2024-06-10 13:58:04.577552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.579108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.579143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.579178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.579210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.372 [2024-06-10 13:58:04.579451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.579460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.582975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.583215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.583226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.372 [2024-06-10 13:58:04.587882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.587930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.587962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.587993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.588725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.591602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.591638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.372 [2024-06-10 13:58:04.591672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.591703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.591938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.591973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.592005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.592036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.592067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.592600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.592609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.596954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.596997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.372 [2024-06-10 13:58:04.597409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.597801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.372 [2024-06-10 13:58:04.601655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.601992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.606974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.372 [2024-06-10 13:58:04.607005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.607036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.607384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.607393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.611996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.612236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.612245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.615826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.616195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.616203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.620661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.620697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.620728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.620775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.621425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.625298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.625948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.630622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.630675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.630713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.630744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.630983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.631018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.631049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.631080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.631112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.631461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.631471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.634726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.634763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.634794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.634825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.635062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.635100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.635859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.635892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.637297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.637540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.637550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.641654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.641691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.641721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.641752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.642205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.642243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.642567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.642599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.644156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.373 [2024-06-10 13:58:04.644407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.644416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.648706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.648742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.648773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.648804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.649113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.649149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.650779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.650812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.651132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.651479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.373 [2024-06-10 13:58:04.651488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.852827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.853147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.853429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.853438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.855499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.855826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.856148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.856472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.856791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.857116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.857441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.857762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.858087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.858391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.858401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.860881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.861208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.861530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.861851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.862226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.862552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.862874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.863199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.863521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.863871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.863880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.866451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.866775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.867114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.867438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.867814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.868140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.868466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.868787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.869113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.869470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.869480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.871855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.872181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.872504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.872825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.873165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.873491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.873813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.874134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.874457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.874808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.874817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.877199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.877527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.877850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.878175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.878494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.878820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.879142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.879467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.879792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.880147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.880156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.882447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.882786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.883108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.883450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.883776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.884100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.641 [2024-06-10 13:58:04.884137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.884461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.884494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.884945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.884954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.886981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.887308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.887630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.887953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.888296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.641 [2024-06-10 13:58:04.888621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.888654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.888975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.889007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.889422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.889432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.891794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.892120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.892444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.892765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.893104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.893432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.893466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.893787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.893819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.894186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.894195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.897589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.897625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.898664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.899348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.899761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.900085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.900117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.900441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.900762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.901061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.901070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.903122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.903449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.903484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.903805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.904193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.904228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.904549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.904871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.905193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.905535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.905543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.907720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.907755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.908074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.908398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.908805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.908840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.909159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.909484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.909807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.910102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.910110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.911523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.913147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.914752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.914791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.915031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.915065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.916688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.917009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.917333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.917650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.917658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.920987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.922097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.922130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.923752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.924001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.924037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.925596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.927165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.927489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.927822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.927831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.931358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.931393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.932956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.934002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.934246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.934287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.935580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.937141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.938712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.642 [2024-06-10 13:58:04.939040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.939052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.941296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.942593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.944153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.944189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.944427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.944463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.945678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.947089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.947121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.947440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.642 [2024-06-10 13:58:04.947449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.643 [2024-06-10 13:58:04.949450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [previous message repeated for all subsequent allocation attempts through 2024-06-10 13:58:05.053379] 
00:29:50.646 [2024-06-10 13:58:05.053622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.053632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.055869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.056195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.056227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.056259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.056590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.056625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.057919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.057951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.057982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.058222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.058231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.059645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.059679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.061217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.061250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.061487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.062602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.064417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.065978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.066011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.066042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.066283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.067744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.069458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.069493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.069525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.069846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.070218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.070837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.070870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.070901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.070932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.071246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.071256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.072654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.072688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.074216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.074250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.074491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.076066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.076100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.076131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.076165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.076551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.076560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.078593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.079878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.079911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.079942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.080184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.081753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.081786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.081817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.081848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.082250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.082259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.083907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.083941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.083972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.084296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.084632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.084955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.084989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.085020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.086214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.086515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.646 [2024-06-10 13:58:05.086523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.087994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.088028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.646 [2024-06-10 13:58:05.089650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.089684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.089922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.089957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.089990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.091425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.091458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.091769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.091777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.093807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.647 [2024-06-10 13:58:05.095102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.095135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.095169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.095407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.095442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.097001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.097033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.097706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.097946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.097955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.099408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.099732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.099764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.647 [2024-06-10 13:58:05.099795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.100183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.100218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.100538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.100570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.101871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.102135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.102144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.104978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.105014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.105046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.106589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.106830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.647 [2024-06-10 13:58:05.106866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.107189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.107222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.107542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.107890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.107898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.109449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.111016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.111049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.111081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.111366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.647 [2024-06-10 13:58:05.111401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.113007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.912 [2024-06-10 13:58:05.113041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.114607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.114847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.114856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.117201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.117733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.117766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.119307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.119546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.119582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.120717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.120750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.122024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.912 [2024-06-10 13:58:05.122267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.122276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.124960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.125286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.126742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.128321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.128559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.128595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.130232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.130264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.131399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.131676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.131684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.912 [2024-06-10 13:58:05.133507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.133832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.134153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.135745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.135999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.136035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.137584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.137617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.139187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.139593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.139601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.141240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.912 [2024-06-10 13:58:05.141563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.912 [2024-06-10 13:58:05.141884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:50.915 [... previous error message repeated through 2024-06-10 13:58:05.296997 ...]
00:29:50.915 [2024-06-10 13:58:05.298946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.298981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.299516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.300808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.301048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.302662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.302698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.304059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.304091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.304408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.304417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.306159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.306196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.915 [2024-06-10 13:58:05.306517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.306854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.307204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.308472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.308505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.309802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.309835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.310073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.310081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.311485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.313039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.314261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.314294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.915 [2024-06-10 13:58:05.314659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.915 [2024-06-10 13:58:05.314984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.315017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.315340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.315372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.315735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.315743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.317900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.317935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.319364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.320925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.321172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.322787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.916 [2024-06-10 13:58:05.322820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.323140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.323175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.323519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.323527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.325418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.325458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.327018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.327051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.327292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.328278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.328311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.329865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.916 [2024-06-10 13:58:05.329898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.330137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.330146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.331792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.331826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.331858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.331889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.332218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.332543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.332575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.333725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.333757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.334077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.916 [2024-06-10 13:58:05.334085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.335468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.335506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.335548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.335579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.335817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.337444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.337477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.338854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.338887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.339334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.339343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.341414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.916 [2024-06-10 13:58:05.341447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.341478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.341509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.341813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.343361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.343394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.344946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.344979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.345439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.345448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.346787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.346821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.346855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.916 [2024-06-10 13:58:05.346886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.347306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.347630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.347663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.347984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.348016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.348358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.348370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.349750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.349783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.916 [2024-06-10 13:58:05.349815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.349846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.350175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.351677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.351715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.351746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.351777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.352013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.352021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.353858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.353891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.353923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.353954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.354430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.354826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.356989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.356997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.358590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.358624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.358658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.358689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.359503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.360945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.360978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.361746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.363206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.363696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.364040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.364049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.365546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.365579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.365610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:50.917 [2024-06-10 13:58:05.365641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:50.917 [2024-06-10 13:58:05.365877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:50.917 [... same error repeated continuously; duplicate entries spanning 13:58:05.365911 through 13:58:05.487098 elided ...]
00:29:51.184 [2024-06-10 13:58:05.487427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.184 [2024-06-10 13:58:05.487750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.488073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.488417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.488453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.488774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.488808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.489143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.489596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.489606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.491972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.492302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.492625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.492950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.493300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.493337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.493658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.493980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.494321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.494640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.494650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.497144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.497471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.497793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.498119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.498407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.498733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.499056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.499379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.499701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.500034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.500046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.502497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.502824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.503146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.503475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.503863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.504191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.504513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.504834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.505155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.505603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.505612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.507995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.508335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.508658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.508996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.509337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.509660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.509983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.510306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.510629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.510958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.510967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.513001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.513330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.513652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.513973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.514316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.514641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.514962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.515286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.515607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.515931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.515941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.518093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.518420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.518741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.519063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.519352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.519677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.519999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.520328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.520649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.521000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.521016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.523158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.523485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.523807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.184 [2024-06-10 13:58:05.524128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.524539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.524864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.525188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.525511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.525832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.526247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.526256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.184 [2024-06-10 13:58:05.529003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.529333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.529656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.529990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.530309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.530636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.530959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.531283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.531608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.532029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.532038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.534366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.534710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.535031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.535355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.535693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.536018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.536345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.536667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.536987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.537315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.537324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.539753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.540079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.540402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.540725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.541236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.541561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.541887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.542213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.542535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.542867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.542876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.545678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.546009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.547525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.547848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.548087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.548511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.548835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.549156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.549482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.549817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.549827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.552628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.552957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.553282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.553603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.553995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.554322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.554644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.554965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.555292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.555573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.555583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.558366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.558692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.559013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.559337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.559641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.559966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.559999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.560325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.560358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.560763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.560773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.563317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.564602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.566160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.567714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.567957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.569023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.569056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.570330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.570363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.570600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.570608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.572606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.572932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.574464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.576077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.576321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.185 [2024-06-10 13:58:05.577892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.185 [2024-06-10 13:58:05.577926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical *ERROR* line repeated for timestamps 13:58:05.578874 through 13:58:05.712077)
00:29:51.453 [2024-06-10 13:58:05.712410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.712962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.714670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.714704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.714739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.714770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.453 [2024-06-10 13:58:05.715076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.715389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.716799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.716833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.716864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.716895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.453 [2024-06-10 13:58:05.717265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.717570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.719231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.453 [2024-06-10 13:58:05.719265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.719930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.719942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.721836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.722168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.722177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.725024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.725058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.725089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.725120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.725442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.725480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.727034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.727067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.728612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.728899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.728908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.730260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.730294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.730326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.730357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.730679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.730715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.731038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.731070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.731397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.731719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.731728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.733171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.733206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.733237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.733268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.733610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.733647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.735192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.735226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.736788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.737029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.737038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.739826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.741102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.741136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.741170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.741409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.741418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.742888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.742923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.744473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.744506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.744744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.745072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.745106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.745137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.745172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.745498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.745507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.747035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.748585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.748619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.748650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.748889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.749644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.749678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.749710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.749740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.750034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.750043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.454 [2024-06-10 13:58:05.751831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.751868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.751900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.752224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.752579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.753581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.454 [2024-06-10 13:58:05.753615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.753647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.753677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.753965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.753974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.755486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.755525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.455 [2024-06-10 13:58:05.757081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.757120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.757363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.758805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.758839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.758871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.758902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.759214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.759224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.761158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.762720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.762755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.762788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.455 [2024-06-10 13:58:05.763026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.763957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.765846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.765882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.765914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.766239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.766513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.767804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.455 [2024-06-10 13:58:05.767837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.767868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.769420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.769661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.769670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.771125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.771206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.772369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.772402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.772845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.772881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.772913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.455 [2024-06-10 13:58:05.773237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.455 [2024-06-10 13:58:05.773270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.720 [2024-06-10 13:58:05.927937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.929498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.929843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.930251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.930260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.933417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.934994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.935027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.935695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.935958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.935994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.937551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.939100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.720 [2024-06-10 13:58:05.940364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.940699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.940708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.944024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.944061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.945614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.946867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.947215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.947251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.948534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.950100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.951663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.952078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.720 [2024-06-10 13:58:05.952088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.954610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.955895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.957426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.957459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.957698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.957734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.958411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.959697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.961234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.961476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.961492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.963493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.720 [2024-06-10 13:58:05.964933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.964967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.966515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.966758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.966794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.968228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.969409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.970695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.970933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.970942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.973230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.973267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.974363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.720 [2024-06-10 13:58:05.975645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.975885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.975922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.977482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.978273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.979788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.980027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.980037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.981894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.982222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.720 [2024-06-10 13:58:05.982545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.982587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.982825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.721 [2024-06-10 13:58:05.982859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.984249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.985819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.985857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.986096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.986104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.988646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.988972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.989006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.989332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.989855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.990183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:51.721 [2024-06-10 13:58:05.991727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:51.721 [2024-06-10 13:58:05.991761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.721 [2024-06-10 13:58:05.993306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.721 [2024-06-10 13:58:05.993547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.721 [2024-06-10 13:58:05.993556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:51.981
00:29:51.981 Latency(us)
00:29:51.981 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:51.981 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x0 length 0x100
00:29:51.981 crypto_ram : 5.76 44.44 2.78 0.00 0.00 2789560.32 346030.08 2278905.17
00:29:51.981 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x100 length 0x100
00:29:51.981 crypto_ram : 5.84 43.86 2.74 0.00 0.00 2836398.08 267386.88 2390753.28
00:29:51.981 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x0 length 0x100
00:29:51.981 crypto_ram1 : 5.76 44.43 2.78 0.00 0.00 2693891.41 283115.52 2083170.99
00:29:51.981 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x100 length 0x100
00:29:51.981 crypto_ram1 : 5.84 43.85 2.74 0.00 0.00 2735957.33 267386.88 2195019.09
00:29:51.981 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x0 length 0x100
00:29:51.981 crypto_ram2 : 5.54 308.24 19.27 0.00 0.00 373833.80 23374.51 594193.07
00:29:51.981 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x100 length 0x100
00:29:51.981 crypto_ram2 : 5.61 295.36 18.46 0.00 0.00 389819.75 2239.15 594193.07
00:29:51.981 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x0 length 0x100
00:29:51.981 crypto_ram3 : 5.63 319.77 19.99 0.00 0.00 350859.61 3686.40 443897.17
00:29:51.981 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:51.981 Verification LBA range: start 0x100 length 0x100
00:29:51.981 crypto_ram3 : 5.70 306.61 19.16 0.00 0.00 363693.49 36044.80 323310.93
00:29:51.981 ===================================================================================================================
00:29:51.981 Total : 1406.55 87.91 0.00 0.00 678091.17 2239.15 2390753.28
00:29:52.241
00:29:52.241 real 0m8.692s
00:29:52.241 user 0m16.728s
00:29:52.241 sys 0m0.281s
00:29:52.241 13:58:06 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:29:52.241 13:58:06 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:29:52.241 ************************************
00:29:52.241 END TEST bdev_verify_big_io
00:29:52.241 ************************************
00:29:52.241 13:58:06 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.241 13:58:06 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:29:52.241 13:58:06 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:29:52.241 13:58:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:52.501 ************************************
00:29:52.501 START TEST bdev_write_zeroes
00:29:52.501 ************************************
00:29:52.501 13:58:06
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:52.501 [2024-06-10 13:58:06.804130] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:29:52.501 [2024-06-10 13:58:06.804183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1751059 ]
00:29:52.501 [2024-06-10 13:58:06.891561] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:52.501 [2024-06-10 13:58:06.959961] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:29:52.761 [2024-06-10 13:58:06.981030] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:29:52.761 [2024-06-10 13:58:06.989070] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:29:52.761 [2024-06-10 13:58:06.997076] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:29:52.761 [2024-06-10 13:58:07.084038] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:29:55.302 [2024-06-10 13:58:09.216834] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:29:55.302 [2024-06-10 13:58:09.216883] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:29:55.302 [2024-06-10 13:58:09.216892] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:55.302 [2024-06-10 13:58:09.224852] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:29:55.302 [2024-06-10 13:58:09.224864] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:29:55.302 [2024-06-10 13:58:09.224870] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:55.302 [2024-06-10 13:58:09.232873] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:29:55.302 [2024-06-10 13:58:09.232884] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:29:55.302 [2024-06-10 13:58:09.232890] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:55.302 [2024-06-10 13:58:09.240893] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:29:55.302 [2024-06-10 13:58:09.240904] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:29:55.302 [2024-06-10 13:58:09.240911] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:55.302 Running I/O for 1 seconds...
00:29:55.873
00:29:55.873 Latency(us)
00:29:55.873 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:55.873 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:55.873 crypto_ram : 1.02 2206.16 8.62 0.00 0.00 57615.78 5215.57 69468.16
00:29:55.873 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:55.873 crypto_ram1 : 1.02 2219.32 8.67 0.00 0.00 57017.03 5133.65 64225.28
00:29:55.873 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:55.873 crypto_ram2 : 1.02 17041.70 66.57 0.00 0.00 7408.15 2266.45 9775.79
00:29:55.873 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:55.873 crypto_ram3 : 1.02 17073.90 66.69 0.00 0.00 7371.64 2266.45 9557.33
00:29:55.873 ===================================================================================================================
00:29:55.873 Total : 38541.08 150.55 0.00 0.00 13145.02 2266.45 69468.16
00:29:56.133
00:29:56.133 real 0m3.814s
00:29:56.133 user 0m3.546s
00:29:56.133 sys 0m0.233s
00:29:56.133 13:58:10 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
00:29:56.133 13:58:10 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:29:56.133 ************************************
00:29:56.133 END TEST bdev_write_zeroes
00:29:56.133 ************************************
00:29:56.133 13:58:10 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:56.133 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:29:56.133 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:29:56.133 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:56.393 ************************************
00:29:56.393 START TEST bdev_json_nonenclosed
00:29:56.393 ************************************
00:29:56.393 13:58:10 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:56.393 [2024-06-10 13:58:10.699532] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:29:56.393 [2024-06-10 13:58:10.699587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1751741 ]
00:29:56.393 [2024-06-10 13:58:10.791775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:56.654 [2024-06-10 13:58:10.870113] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:29:56.654 [2024-06-10 13:58:10.870170] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:29:56.654 [2024-06-10 13:58:10.870182] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:29:56.654 [2024-06-10 13:58:10.870189] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:29:56.654
00:29:56.654 real 0m0.288s
00:29:56.654 user 0m0.181s
00:29:56.654 sys 0m0.105s
00:29:56.654 13:58:10 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable
00:29:56.654 13:58:10 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:29:56.654 ************************************
00:29:56.654 END TEST bdev_json_nonenclosed
00:29:56.654 ************************************
00:29:56.654 13:58:10 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:56.654 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:29:56.654 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:29:56.654 13:58:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:56.654 ************************************
00:29:56.654 START TEST bdev_json_nonarray
00:29:56.654 ************************************
00:29:56.654 13:58:11 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:56.654 [2024-06-10 13:58:11.068730] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization...
00:29:56.654 [2024-06-10 13:58:11.068780] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1751889 ]
00:29:56.916 [2024-06-10 13:58:11.160151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:56.916 [2024-06-10 13:58:11.236629] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0
00:29:56.916 [2024-06-10 13:58:11.236690] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:29:56.916 [2024-06-10 13:58:11.236703] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:29:56.916 [2024-06-10 13:58:11.236710] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:29:56.916
00:29:56.916 real 0m0.285s
00:29:56.916 user 0m0.182s
00:29:56.916 sys 0m0.101s
00:29:56.916 13:58:11 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable
00:29:56.916 13:58:11 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:29:56.916 ************************************
00:29:56.916 END TEST bdev_json_nonarray
00:29:56.916 ************************************
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:29:56.916 13:58:11 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:29:56.916
00:29:56.916 real 1m7.919s
00:29:56.916 user 2m49.610s
00:29:56.916 sys 0m6.328s
00:29:56.916 13:58:11 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # xtrace_disable
00:29:56.916 13:58:11 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:56.916 ************************************
00:29:56.916 END TEST blockdev_crypto_qat
00:29:56.916 ************************************
00:29:56.916 13:58:11 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:29:56.916 13:58:11 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:29:56.916 13:58:11 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:29:56.916 13:58:11 -- common/autotest_common.sh@10 -- # set +x
00:29:57.177 ************************************
00:29:57.177 START TEST chaining
00:29:57.177 ************************************
00:29:57.177 13:58:11 chaining -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:29:57.177 * Looking for test storage...
00:29:57.177 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@7 -- # uname -s
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:806f5428-4aec-ec11-9bc7-a4bf01928306
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=806f5428-4aec-ec11-9bc7-a4bf01928306
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:29:57.177 13:58:11 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:29:57.177 13:58:11 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:29:57.177 13:58:11 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:29:57.177 13:58:11 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.177 13:58:11 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.177 13:58:11 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.177 13:58:11 chaining -- paths/export.sh@5 -- # export PATH
00:29:57.177 13:58:11 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@47 -- # : 0
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500)
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122)
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@20 -- # declare -A stats
00:29:57.177 13:58:11 chaining -- bdev/chaining.sh@66 -- # nvmftestinit
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']'
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@448 -- # prepare_net_devs
00:29:57.177 13:58:11 chaining -- nvmf/common.sh@410 -- # local
-g is_hw=no 00:29:57.177 13:58:11 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:29:57.177 13:58:11 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:29:57.177 13:58:11 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:29:57.177 13:58:11 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:29:57.177 13:58:11 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:29:57.177 13:58:11 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:29:57.177 13:58:11 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:29:57.177 13:58:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@296 -- # e810=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@297 -- # x722=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@298 -- # mlx=() 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@336 -- # return 1 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:30:05.361 WARNING: No supported devices were found, fallback requested for tcp test 00:30:05.361 13:58:19 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:30:05.361 13:58:19 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:30:05.622 Cannot find device "nvmf_tgt_br" 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@155 -- # true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:30:05.622 Cannot find device "nvmf_tgt_br2" 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@156 -- # true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:30:05.622 Cannot find device "nvmf_tgt_br" 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@158 -- # 
true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:30:05.622 Cannot find device "nvmf_tgt_br2" 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@159 -- # true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:30:05.622 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@162 -- # true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:30:05.622 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@163 -- # true 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:30:05.622 13:58:19 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:30:05.622 13:58:20 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:30:05.622 13:58:20 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:30:05.622 13:58:20 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:30:05.622 13:58:20 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:30:05.883 13:58:20 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:30:06.144 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:30:06.144 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.112 ms 00:30:06.144 00:30:06.144 --- 10.0.0.2 ping statistics --- 00:30:06.144 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:06.144 rtt min/avg/max/mdev = 0.112/0.112/0.112/0.000 ms 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:30:06.144 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:30:06.144 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.073 ms 00:30:06.144 00:30:06.144 --- 10.0.0.3 ping statistics --- 00:30:06.144 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:06.144 rtt min/avg/max/mdev = 0.073/0.073/0.073/0.000 ms 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:30:06.144 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:30:06.144 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.041 ms 00:30:06.144 00:30:06.144 --- 10.0.0.1 ping statistics --- 00:30:06.144 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:06.144 rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@433 -- # return 0 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:30:06.144 13:58:20 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@481 -- # nvmfpid=1756692 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@482 -- # waitforlisten 1756692 00:30:06.144 13:58:20 chaining -- nvmf/common.sh@480 -- # ip netns exec 
nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@830 -- # '[' -z 1756692 ']' 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:06.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:06.144 13:58:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.144 [2024-06-10 13:58:20.501137] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:06.144 [2024-06-10 13:58:20.501199] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:06.144 [2024-06-10 13:58:20.583430] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:06.404 [2024-06-10 13:58:20.657690] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:30:06.404 [2024-06-10 13:58:20.657726] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:30:06.404 [2024-06-10 13:58:20.657734] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:30:06.404 [2024-06-10 13:58:20.657740] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:30:06.404 [2024-06-10 13:58:20.657746] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:30:06.404 [2024-06-10 13:58:20.657771] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:06.974 13:58:21 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.974 13:58:21 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@69 -- # mktemp 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.3JRK9tKvWP 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@69 -- # mktemp 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.3sOObzhRIj 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:30:06.974 13:58:21 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:06.974 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.974 malloc0 00:30:06.974 true 00:30:06.974 true 00:30:06.974 [2024-06-10 13:58:21.446030] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:07.234 crypto0 00:30:07.234 [2024-06-10 13:58:21.454054] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:07.234 crypto1 00:30:07.234 [2024-06-10 13:58:21.462152] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:30:07.234 [2024-06-10 13:58:21.478387] tcp.c: 982:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:30:07.234 13:58:21 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:07.234 13:58:21 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:07.234 13:58:21 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.3JRK9tKvWP bs=1K count=64 00:30:07.234 64+0 records in 00:30:07.234 64+0 records out 00:30:07.235 65536 bytes (66 kB, 64 KiB) copied, 0.000990199 s, 66.2 MB/s 00:30:07.235 13:58:21 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.3JRK9tKvWP --ob Nvme0n1 --bs 65536 --count 1 00:30:07.235 13:58:21 chaining -- bdev/chaining.sh@25 -- # local config 00:30:07.235 13:58:21 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:07.235 13:58:21 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:07.235 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:07.494 13:58:21 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:07.494 "subsystems": [ 00:30:07.494 { 00:30:07.494 "subsystem": "bdev", 00:30:07.494 "config": [ 00:30:07.494 { 00:30:07.494 "method": "bdev_nvme_attach_controller", 00:30:07.494 "params": { 00:30:07.494 "trtype": "tcp", 00:30:07.494 "adrfam": "IPv4", 00:30:07.494 "name": "Nvme0", 00:30:07.494 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:07.494 "traddr": "10.0.0.2", 00:30:07.494 "trsvcid": "4420" 00:30:07.494 } 00:30:07.494 }, 00:30:07.494 { 00:30:07.494 "method": "bdev_set_options", 00:30:07.494 "params": { 00:30:07.494 "bdev_auto_examine": false 00:30:07.494 } 00:30:07.494 } 00:30:07.494 ] 00:30:07.494 } 00:30:07.494 ] 00:30:07.494 }' 00:30:07.494 13:58:21 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.3JRK9tKvWP --ob Nvme0n1 --bs 65536 --count 1 00:30:07.494 13:58:21 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:07.494 "subsystems": [ 00:30:07.494 { 00:30:07.494 "subsystem": "bdev", 00:30:07.494 "config": [ 00:30:07.494 { 00:30:07.494 "method": "bdev_nvme_attach_controller", 00:30:07.494 "params": { 
00:30:07.494 "trtype": "tcp", 00:30:07.494 "adrfam": "IPv4", 00:30:07.494 "name": "Nvme0", 00:30:07.494 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:07.494 "traddr": "10.0.0.2", 00:30:07.494 "trsvcid": "4420" 00:30:07.494 } 00:30:07.494 }, 00:30:07.494 { 00:30:07.494 "method": "bdev_set_options", 00:30:07.494 "params": { 00:30:07.494 "bdev_auto_examine": false 00:30:07.494 } 00:30:07.494 } 00:30:07.494 ] 00:30:07.494 } 00:30:07.494 ] 00:30:07.494 }' 00:30:07.494 [2024-06-10 13:58:21.771334] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:07.494 [2024-06-10 13:58:21.771379] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1756869 ] 00:30:07.494 [2024-06-10 13:58:21.860182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:07.494 [2024-06-10 13:58:21.925171] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:08.015  Copying: 64/64 [kB] (average 12 MBps) 00:30:08.015 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@588 -- # 
[[ 0 == 0 ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@96 -- # update_stats 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.015 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.015 13:58:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.276 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.276 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.276 13:58:22 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.276 13:58:22 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.276 13:58:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.276 13:58:22 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.3sOObzhRIj --ib Nvme0n1 --bs 65536 --count 1 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@25 -- # local config 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:08.276 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:08.276 "subsystems": [ 00:30:08.276 { 00:30:08.276 "subsystem": "bdev", 00:30:08.276 "config": [ 00:30:08.276 { 00:30:08.276 "method": "bdev_nvme_attach_controller", 00:30:08.276 "params": { 
00:30:08.276 "trtype": "tcp", 00:30:08.276 "adrfam": "IPv4", 00:30:08.276 "name": "Nvme0", 00:30:08.276 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:08.276 "traddr": "10.0.0.2", 00:30:08.276 "trsvcid": "4420" 00:30:08.276 } 00:30:08.276 }, 00:30:08.276 { 00:30:08.276 "method": "bdev_set_options", 00:30:08.276 "params": { 00:30:08.276 "bdev_auto_examine": false 00:30:08.276 } 00:30:08.276 } 00:30:08.276 ] 00:30:08.276 } 00:30:08.276 ] 00:30:08.276 }' 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3sOObzhRIj --ib Nvme0n1 --bs 65536 --count 1 00:30:08.276 13:58:22 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:08.276 "subsystems": [ 00:30:08.276 { 00:30:08.276 "subsystem": "bdev", 00:30:08.276 "config": [ 00:30:08.276 { 00:30:08.276 "method": "bdev_nvme_attach_controller", 00:30:08.276 "params": { 00:30:08.276 "trtype": "tcp", 00:30:08.276 "adrfam": "IPv4", 00:30:08.276 "name": "Nvme0", 00:30:08.276 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:08.276 "traddr": "10.0.0.2", 00:30:08.276 "trsvcid": "4420" 00:30:08.276 } 00:30:08.276 }, 00:30:08.276 { 00:30:08.276 "method": "bdev_set_options", 00:30:08.276 "params": { 00:30:08.276 "bdev_auto_examine": false 00:30:08.276 } 00:30:08.276 } 00:30:08.276 ] 00:30:08.276 } 00:30:08.276 ] 00:30:08.276 }' 00:30:08.276 [2024-06-10 13:58:22.678408] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:30:08.276 [2024-06-10 13:58:22.678458] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757223 ] 00:30:08.536 [2024-06-10 13:58:22.766255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:08.536 [2024-06-10 13:58:22.831346] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:08.796  Copying: 64/64 [kB] (average 20 MBps) 00:30:08.796 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:08.796 
13:58:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:08.796 13:58:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:08.796 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.3JRK9tKvWP /tmp/tmp.3sOObzhRIj 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@25 -- # local config 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:09.057 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:09.057 "subsystems": [ 00:30:09.057 { 00:30:09.057 "subsystem": "bdev", 00:30:09.057 "config": [ 00:30:09.057 { 00:30:09.057 "method": "bdev_nvme_attach_controller", 00:30:09.057 "params": { 00:30:09.057 "trtype": "tcp", 00:30:09.057 "adrfam": "IPv4", 00:30:09.057 "name": "Nvme0", 00:30:09.057 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:09.057 "traddr": "10.0.0.2", 00:30:09.057 "trsvcid": "4420" 00:30:09.057 } 00:30:09.057 }, 00:30:09.057 { 00:30:09.057 "method": "bdev_set_options", 00:30:09.057 "params": { 00:30:09.057 "bdev_auto_examine": false 00:30:09.057 } 00:30:09.057 } 00:30:09.057 ] 00:30:09.057 } 00:30:09.057 ] 00:30:09.057 }' 00:30:09.057 13:58:23 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:09.057 13:58:23 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:09.057 "subsystems": [ 00:30:09.057 { 00:30:09.057 "subsystem": "bdev", 00:30:09.057 "config": [ 00:30:09.057 { 00:30:09.057 "method": "bdev_nvme_attach_controller", 00:30:09.057 "params": { 00:30:09.057 "trtype": "tcp", 00:30:09.057 "adrfam": "IPv4", 00:30:09.057 "name": "Nvme0", 00:30:09.057 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:09.057 "traddr": "10.0.0.2", 00:30:09.057 "trsvcid": "4420" 00:30:09.057 } 00:30:09.057 }, 00:30:09.057 { 00:30:09.057 "method": "bdev_set_options", 00:30:09.057 "params": { 00:30:09.057 "bdev_auto_examine": false 00:30:09.057 } 00:30:09.057 } 00:30:09.057 ] 00:30:09.057 } 00:30:09.057 ] 00:30:09.057 }' 00:30:09.057 [2024-06-10 13:58:23.410172] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:09.057 [2024-06-10 13:58:23.410221] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757259 ] 00:30:09.057 [2024-06-10 13:58:23.496388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:09.317 [2024-06-10 13:58:23.561541] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:09.578  Copying: 64/64 [kB] (average 9142 kBps) 00:30:09.578 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@106 -- # update_stats 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:09.578 13:58:23 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:09.578 13:58:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:09.578 13:58:23 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:09.578 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:09.578 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:09.578 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:09.838 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.3JRK9tKvWP --ob Nvme0n1 --bs 4096 --count 16 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@25 -- # local config 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:09.838 13:58:24 chaining 
-- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:09.838 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:09.838 "subsystems": [ 00:30:09.838 { 00:30:09.838 "subsystem": "bdev", 00:30:09.838 "config": [ 00:30:09.838 { 00:30:09.838 "method": "bdev_nvme_attach_controller", 00:30:09.838 "params": { 00:30:09.838 "trtype": "tcp", 00:30:09.838 "adrfam": "IPv4", 00:30:09.838 "name": "Nvme0", 00:30:09.838 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:09.838 "traddr": "10.0.0.2", 00:30:09.838 "trsvcid": "4420" 00:30:09.838 } 00:30:09.838 }, 00:30:09.838 { 00:30:09.838 "method": "bdev_set_options", 00:30:09.838 "params": { 00:30:09.838 "bdev_auto_examine": false 00:30:09.838 } 00:30:09.838 } 00:30:09.838 ] 00:30:09.838 } 00:30:09.838 ] 00:30:09.838 }' 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.3JRK9tKvWP --ob Nvme0n1 --bs 4096 --count 16 00:30:09.838 13:58:24 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:09.838 "subsystems": [ 00:30:09.838 { 00:30:09.838 "subsystem": "bdev", 00:30:09.838 "config": [ 00:30:09.838 { 00:30:09.838 "method": "bdev_nvme_attach_controller", 00:30:09.838 "params": { 00:30:09.838 "trtype": "tcp", 00:30:09.838 "adrfam": "IPv4", 00:30:09.838 "name": "Nvme0", 00:30:09.838 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:09.838 "traddr": "10.0.0.2", 00:30:09.838 "trsvcid": "4420" 00:30:09.838 } 00:30:09.838 }, 00:30:09.838 { 00:30:09.838 "method": "bdev_set_options", 00:30:09.838 "params": { 00:30:09.838 "bdev_auto_examine": false 00:30:09.838 } 00:30:09.838 } 00:30:09.838 ] 00:30:09.838 } 00:30:09.838 ] 00:30:09.838 }' 00:30:09.838 [2024-06-10 13:58:24.162652] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:30:09.838 [2024-06-10 13:58:24.162698] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757563 ] 00:30:09.838 [2024-06-10 13:58:24.248475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:09.838 [2024-06-10 13:58:24.313098] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:10.360  Copying: 64/64 [kB] (average 9142 kBps) 00:30:10.360 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:10.360 
13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@114 -- # update_stats 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:10.360 13:58:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.360 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.621 13:58:24 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:10.621 13:58:24 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:10.621 13:58:24 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@117 -- # : 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.3sOObzhRIj --ib Nvme0n1 --bs 4096 --count 16 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@25 -- # local config 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:10.621 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:10.621 13:58:24 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:10.621 13:58:25 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:10.621 "subsystems": [ 00:30:10.621 { 00:30:10.621 "subsystem": "bdev", 00:30:10.621 "config": [ 00:30:10.621 { 00:30:10.621 "method": "bdev_nvme_attach_controller", 00:30:10.621 "params": { 00:30:10.621 "trtype": "tcp", 00:30:10.621 "adrfam": "IPv4", 00:30:10.621 "name": "Nvme0", 00:30:10.621 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:10.621 "traddr": "10.0.0.2", 00:30:10.621 "trsvcid": "4420" 00:30:10.621 } 00:30:10.621 }, 00:30:10.621 { 00:30:10.621 "method": "bdev_set_options", 00:30:10.621 "params": { 00:30:10.621 "bdev_auto_examine": false 00:30:10.621 } 00:30:10.621 } 00:30:10.621 ] 00:30:10.621 } 00:30:10.621 ] 00:30:10.621 }' 00:30:10.621 13:58:25 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3sOObzhRIj --ib Nvme0n1 --bs 4096 --count 16 00:30:10.621 13:58:25 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:10.621 "subsystems": [ 00:30:10.621 { 00:30:10.621 "subsystem": "bdev", 00:30:10.621 "config": [ 00:30:10.621 { 00:30:10.621 "method": "bdev_nvme_attach_controller", 00:30:10.621 "params": { 00:30:10.621 "trtype": "tcp", 00:30:10.621 "adrfam": "IPv4", 00:30:10.621 "name": "Nvme0", 00:30:10.621 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:10.621 "traddr": "10.0.0.2", 00:30:10.621 "trsvcid": "4420" 00:30:10.621 } 00:30:10.621 }, 00:30:10.621 { 00:30:10.621 "method": "bdev_set_options", 00:30:10.621 "params": { 00:30:10.621 "bdev_auto_examine": false 00:30:10.621 } 00:30:10.621 } 00:30:10.621 ] 00:30:10.621 } 00:30:10.621 ] 00:30:10.621 }' 00:30:10.621 [2024-06-10 13:58:25.049706] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:30:10.621 [2024-06-10 13:58:25.049756] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1757646 ] 00:30:10.882 [2024-06-10 13:58:25.135694] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:10.882 [2024-06-10 13:58:25.199794] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:11.142  Copying: 64/64 [kB] (average 1391 kBps) 00:30:11.142 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:11.142 13:58:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:11.142 13:58:25 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:11.142 13:58:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.142 13:58:25 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:11.403 
13:58:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.3JRK9tKvWP /tmp/tmp.3sOObzhRIj 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.3JRK9tKvWP /tmp/tmp.3sOObzhRIj 00:30:11.403 13:58:25 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@117 -- # sync 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@120 -- # set +e 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:30:11.403 rmmod nvme_tcp 00:30:11.403 rmmod nvme_fabrics 00:30:11.403 rmmod nvme_keyring 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@124 -- # set -e 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@125 -- # return 0 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@489 -- # '[' -n 1756692 ']' 00:30:11.403 13:58:25 chaining -- nvmf/common.sh@490 -- # killprocess 1756692 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@949 -- # '[' -z 1756692 ']' 00:30:11.403 13:58:25 chaining -- 
common/autotest_common.sh@953 -- # kill -0 1756692 00:30:11.403 13:58:25 chaining -- common/autotest_common.sh@954 -- # uname 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1756692 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1756692' 00:30:11.664 killing process with pid 1756692 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@968 -- # kill 1756692 00:30:11.664 13:58:25 chaining -- common/autotest_common.sh@973 -- # wait 1756692 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:11.664 13:58:26 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:30:11.664 13:58:26 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:30:11.664 13:58:26 chaining -- bdev/chaining.sh@132 -- # bperfpid=1758002 00:30:11.664 13:58:26 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1758002 00:30:11.664 13:58:26 chaining -- bdev/chaining.sh@131 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@830 -- # '[' -z 1758002 ']' 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:11.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:11.664 13:58:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.925 [2024-06-10 13:58:26.166951] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:11.925 [2024-06-10 13:58:26.167002] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1758002 ] 00:30:11.925 [2024-06-10 13:58:26.258032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:11.925 [2024-06-10 13:58:26.353279] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:12.869 13:58:27 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:12.869 13:58:27 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:12.869 13:58:27 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:30:12.869 13:58:27 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:12.869 13:58:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:12.869 malloc0 00:30:12.869 true 00:30:12.869 true 00:30:12.869 [2024-06-10 13:58:27.168322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
00:30:12.869 crypto0 00:30:12.869 [2024-06-10 13:58:27.176343] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:12.869 crypto1 00:30:12.869 13:58:27 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:12.869 13:58:27 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:12.869 Running I/O for 5 seconds... 00:30:18.157 00:30:18.157 Latency(us) 00:30:18.157 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.157 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:30:18.157 Verification LBA range: start 0x0 length 0x2000 00:30:18.157 crypto1 : 5.01 13199.19 51.56 0.00 0.00 19340.27 1358.51 14199.47 00:30:18.157 =================================================================================================================== 00:30:18.157 Total : 13199.19 51.56 0.00 0.00 19340.27 1358.51 14199.47 00:30:18.157 0 00:30:18.157 13:58:32 chaining -- bdev/chaining.sh@146 -- # killprocess 1758002 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@949 -- # '[' -z 1758002 ']' 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@953 -- # kill -0 1758002 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@954 -- # uname 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1758002 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1758002' 00:30:18.157 killing process with pid 1758002 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@968 -- # kill 1758002 00:30:18.157 Received shutdown signal, test time 
was about 5.000000 seconds 00:30:18.157 00:30:18.157 Latency(us) 00:30:18.157 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.157 =================================================================================================================== 00:30:18.157 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@973 -- # wait 1758002 00:30:18.157 13:58:32 chaining -- bdev/chaining.sh@152 -- # bperfpid=1759070 00:30:18.157 13:58:32 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1759070 00:30:18.157 13:58:32 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@830 -- # '[' -z 1759070 ']' 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:18.157 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:18.157 13:58:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:18.157 [2024-06-10 13:58:32.562681] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:30:18.157 [2024-06-10 13:58:32.562732] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1759070 ] 00:30:18.418 [2024-06-10 13:58:32.650015] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:18.418 [2024-06-10 13:58:32.715007] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:18.988 13:58:33 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:18.988 13:58:33 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:18.988 13:58:33 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:30:18.988 13:58:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:18.988 13:58:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:19.249 malloc0 00:30:19.249 true 00:30:19.249 true 00:30:19.249 [2024-06-10 13:58:33.522860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:30:19.249 [2024-06-10 13:58:33.522898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:19.249 [2024-06-10 13:58:33.522910] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165b630 00:30:19.249 [2024-06-10 13:58:33.522917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:19.249 [2024-06-10 13:58:33.523832] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:19.249 [2024-06-10 13:58:33.523855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:30:19.249 pt0 00:30:19.249 [2024-06-10 13:58:33.530887] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:19.249 crypto0 00:30:19.249 [2024-06-10 13:58:33.538906] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:19.249 crypto1 00:30:19.249 13:58:33 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:19.249 13:58:33 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:19.249 Running I/O for 5 seconds... 00:30:24.539 00:30:24.539 Latency(us) 00:30:24.539 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:24.539 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:30:24.539 Verification LBA range: start 0x0 length 0x2000 00:30:24.539 crypto1 : 5.01 10421.81 40.71 0.00 0.00 24502.52 5679.79 14745.60 00:30:24.539 =================================================================================================================== 00:30:24.539 Total : 10421.81 40.71 0.00 0.00 24502.52 5679.79 14745.60 00:30:24.539 0 00:30:24.539 13:58:38 chaining -- bdev/chaining.sh@167 -- # killprocess 1759070 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@949 -- # '[' -z 1759070 ']' 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@953 -- # kill -0 1759070 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@954 -- # uname 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1759070 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1759070' 00:30:24.539 killing process with pid 1759070 00:30:24.539 13:58:38 chaining -- common/autotest_common.sh@968 -- # kill 1759070 00:30:24.539 Received shutdown signal, test time was about 5.000000 seconds 00:30:24.540 00:30:24.540 Latency(us) 00:30:24.540 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:24.540 
=================================================================================================================== 00:30:24.540 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@973 -- # wait 1759070 00:30:24.540 13:58:38 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:30:24.540 13:58:38 chaining -- bdev/chaining.sh@170 -- # killprocess 1759070 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@949 -- # '[' -z 1759070 ']' 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@953 -- # kill -0 1759070 00:30:24.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 953: kill: (1759070) - No such process 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@976 -- # echo 'Process with pid 1759070 is not found' 00:30:24.540 Process with pid 1759070 is not found 00:30:24.540 13:58:38 chaining -- bdev/chaining.sh@171 -- # wait 1759070 00:30:24.540 13:58:38 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:24.540 13:58:38 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:30:24.540 13:58:38 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@296 -- # e810=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@297 -- # x722=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@298 -- # mlx=() 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:30:24.540 13:58:38 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@336 -- # return 1 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:30:24.540 WARNING: No supported devices were found, fallback requested for tcp test 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:30:24.540 13:58:38 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:30:24.540 Cannot find device "nvmf_tgt_br" 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@155 -- # true 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:30:24.540 Cannot find device "nvmf_tgt_br2" 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@156 -- # true 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:30:24.540 Cannot find device "nvmf_tgt_br" 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@158 -- # true 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:30:24.540 Cannot find device "nvmf_tgt_br2" 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@159 -- # true 00:30:24.540 13:58:38 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:30:24.801 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@162 -- # true 00:30:24.801 13:58:39 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:30:24.801 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@163 -- # true 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:30:24.801 13:58:39 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:30:25.062 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:30:25.062 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.111 ms 00:30:25.062 00:30:25.062 --- 10.0.0.2 ping statistics --- 00:30:25.062 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:25.062 rtt min/avg/max/mdev = 0.111/0.111/0.111/0.000 ms 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:30:25.062 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:30:25.062 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.096 ms 00:30:25.062 00:30:25.062 --- 10.0.0.3 ping statistics --- 00:30:25.062 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:25.062 rtt min/avg/max/mdev = 0.096/0.096/0.096/0.000 ms 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:30:25.062 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:30:25.062 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.060 ms 00:30:25.062 00:30:25.062 --- 10.0.0.1 ping statistics --- 00:30:25.062 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:25.062 rtt min/avg/max/mdev = 0.060/0.060/0.060/0.000 ms 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@433 -- # return 0 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:30:25.062 13:58:39 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@481 -- # nvmfpid=1760813 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@482 -- # waitforlisten 1760813 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@830 -- # '[' -z 1760813 ']' 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:25.062 13:58:39 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:25.062 13:58:39 
chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:25.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:25.062 13:58:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:25.324 [2024-06-10 13:58:39.575727] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:25.324 [2024-06-10 13:58:39.575787] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:25.324 [2024-06-10 13:58:39.656839] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:25.324 [2024-06-10 13:58:39.726514] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:30:25.324 [2024-06-10 13:58:39.726549] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:30:25.324 [2024-06-10 13:58:39.726557] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:30:25.324 [2024-06-10 13:58:39.726563] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:30:25.324 [2024-06-10 13:58:39.726568] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
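The veth bring-up above validates each link by pinging 10.0.0.1-3 and reading the summary line. A sketch of extracting the loss percentage from that summary with bash parameter expansion alone; the summary string is copied from the log, while a live run would capture it with something like `summary=$(ping -c 1 10.0.0.2 | grep 'packet loss')`:

```shell
# Summary line copied verbatim from the ping output in the log above.
summary='1 packets transmitted, 1 received, 0% packet loss, time 0ms'

loss=${summary#*received, }   # drop everything through "received, "
loss=${loss%%\%*}             # keep the digits before the first "%"

if (( loss == 0 )); then
  echo "link up: 0% loss"
fi
```

Parameter expansion avoids spawning awk/sed per ping, which matters little here but keeps the check dependency-free inside the network namespace.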
00:30:25.324 [2024-06-10 13:58:39.726587] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 1 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:26.265 13:58:40 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:26.265 13:58:40 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:30:26.265 13:58:40 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:26.265 malloc0 00:30:26.265 [2024-06-10 13:58:40.472650] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:30:26.265 [2024-06-10 13:58:40.488891] tcp.c: 982:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:26.265 13:58:40 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:30:26.265 13:58:40 chaining -- bdev/chaining.sh@189 -- # bperfpid=1761087 00:30:26.265 13:58:40 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1761087 /var/tmp/bperf.sock 00:30:26.265 13:58:40 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@830 -- # '[' -z 1761087 ']' 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@835 -- # 
local max_retries=100 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:30:26.265 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:26.265 13:58:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:26.265 [2024-06-10 13:58:40.554039] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 00:30:26.265 [2024-06-10 13:58:40.554086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1761087 ] 00:30:26.265 [2024-06-10 13:58:40.640637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:26.265 [2024-06-10 13:58:40.706666] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.205 13:58:41 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:27.205 13:58:41 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:27.205 13:58:41 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:30:27.205 13:58:41 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:30:27.464 [2024-06-10 13:58:41.752951] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:27.464 nvme0n1 00:30:27.464 true 00:30:27.464 crypto0 00:30:27.464 13:58:41 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:30:27.464 Running I/O for 5 seconds... 
00:30:32.748
00:30:32.748 Latency(us)
00:30:32.748 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:32.748 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096)
00:30:32.748 Verification LBA range: start 0x0 length 0x2000
00:30:32.748 crypto0 : 5.02 9447.22 36.90 0.00 0.00 27017.57 3822.93 24139.09
00:30:32.748 ===================================================================================================================
00:30:32.748 Total : 9447.22 36.90 0.00 0.00 27017.57 3822.93 24139.09
00:30:32.748 0
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@39 -- # opcode=
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:30:32.748 13:58:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@205 -- # sequence=94804
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@39 -- # event=executed
00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:30:32.748 13:58:47 chaining --
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:32.748 13:58:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@206 -- # encrypt=47402 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:33.008 13:58:47 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@207 -- # decrypt=47402 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:30:33.268 13:58:47 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@208 -- # crc32c=94804 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:30:33.268 13:58:47 chaining -- bdev/chaining.sh@214 -- # killprocess 1761087 00:30:33.268 13:58:47 chaining -- common/autotest_common.sh@949 -- # '[' -z 1761087 ']' 00:30:33.268 13:58:47 chaining -- common/autotest_common.sh@953 -- # kill -0 1761087 00:30:33.268 13:58:47 chaining -- common/autotest_common.sh@954 -- # uname 00:30:33.268 13:58:47 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:33.268 13:58:47 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1761087 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1761087' 00:30:33.528 killing process with pid 1761087 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@968 -- # kill 1761087 00:30:33.528 Received shutdown signal, test time was about 5.000000 seconds 00:30:33.528 00:30:33.528 Latency(us) 00:30:33.528 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:33.528 
=================================================================================================================== 00:30:33.528 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@973 -- # wait 1761087 00:30:33.528 13:58:47 chaining -- bdev/chaining.sh@219 -- # bperfpid=1762425 00:30:33.528 13:58:47 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1762425 /var/tmp/bperf.sock 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@830 -- # '[' -z 1762425 ']' 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:30:33.528 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:33.528 13:58:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:33.528 13:58:47 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:30:33.528 [2024-06-10 13:58:47.938712] Starting SPDK v24.09-pre git sha1 c5b9f923d / DPDK 24.03.0 initialization... 
00:30:33.528 [2024-06-10 13:58:47.938760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1762425 ] 00:30:33.787 [2024-06-10 13:58:48.023326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:33.787 [2024-06-10 13:58:48.087702] reactor.c: 943:reactor_run: *NOTICE*: Reactor started on core 0 00:30:34.357 13:58:48 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:34.357 13:58:48 chaining -- common/autotest_common.sh@863 -- # return 0 00:30:34.357 13:58:48 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:30:34.357 13:58:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:30:34.927 [2024-06-10 13:58:49.131300] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:34.927 nvme0n1 00:30:34.927 true 00:30:34.927 crypto0 00:30:34.927 13:58:49 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:30:34.927 Running I/O for 5 seconds... 
00:30:40.202
00:30:40.202 Latency(us)
00:30:40.202 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:40.202 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:30:40.202 Verification LBA range: start 0x0 length 0x200
00:30:40.202 crypto0 : 5.00 2112.17 132.01 0.00 0.00 14833.39 757.76 15400.96
00:30:40.202 ===================================================================================================================
00:30:40.202 Total : 2112.17 132.01 0.00 0.00 14833.39 757.76 15400.96
00:30:40.202 0
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@39 -- # opcode=
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@233 -- # sequence=21140
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@39 -- # event=executed
00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:30:40.202 13:58:54 chaining --
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:40.202 13:58:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@234 -- # encrypt=10570 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@235 -- # decrypt=10570 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:30:40.462 13:58:54 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:30:40.462 13:58:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:40.722 13:58:55 chaining -- bdev/chaining.sh@236 -- # crc32c=21140 00:30:40.722 13:58:55 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:30:40.722 13:58:55 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:30:40.722 13:58:55 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:30:40.722 13:58:55 chaining -- bdev/chaining.sh@242 -- # killprocess 1762425 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@949 -- # '[' -z 1762425 ']' 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@953 -- # kill -0 1762425 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@954 -- # uname 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1762425 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1762425' 00:30:40.722 killing process with pid 1762425 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@968 -- # kill 1762425 00:30:40.722 Received shutdown signal, test time was about 5.000000 seconds 00:30:40.722 00:30:40.722 Latency(us) 00:30:40.722 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:40.722 
=================================================================================================================== 00:30:40.722 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:40.722 13:58:55 chaining -- common/autotest_common.sh@973 -- # wait 1762425 00:30:40.982 13:58:55 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@117 -- # sync 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@120 -- # set +e 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:30:40.982 rmmod nvme_tcp 00:30:40.982 rmmod nvme_fabrics 00:30:40.982 rmmod nvme_keyring 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@124 -- # set -e 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@125 -- # return 0 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@489 -- # '[' -n 1760813 ']' 00:30:40.982 13:58:55 chaining -- nvmf/common.sh@490 -- # killprocess 1760813 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@949 -- # '[' -z 1760813 ']' 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@953 -- # kill -0 1760813 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@954 -- # uname 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 1760813 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 1760813' 00:30:40.982 killing process with pid 
1760813 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@968 -- # kill 1760813 00:30:40.982 13:58:55 chaining -- common/autotest_common.sh@973 -- # wait 1760813 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:41.243 13:58:55 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:41.243 13:58:55 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:41.243 13:58:55 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:30:41.243 13:58:55 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:30:41.243 00:30:41.243 real 0m44.215s 00:30:41.243 user 0m56.563s 00:30:41.243 sys 0m11.558s 00:30:41.243 13:58:55 chaining -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:41.243 13:58:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:41.243 ************************************ 00:30:41.243 END TEST chaining 00:30:41.243 ************************************ 00:30:41.243 13:58:55 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:30:41.243 13:58:55 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:30:41.243 13:58:55 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:30:41.243 13:58:55 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:30:41.243 13:58:55 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:30:41.243 13:58:55 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:30:41.243 13:58:55 -- common/autotest_common.sh@723 -- # xtrace_disable 00:30:41.243 13:58:55 -- common/autotest_common.sh@10 -- # set +x 00:30:41.243 
13:58:55 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:30:41.243 13:58:55 -- common/autotest_common.sh@1391 -- # local autotest_es=0 00:30:41.243 13:58:55 -- common/autotest_common.sh@1392 -- # xtrace_disable 00:30:41.243 13:58:55 -- common/autotest_common.sh@10 -- # set +x 00:30:49.377 INFO: APP EXITING 00:30:49.377 INFO: killing all VMs 00:30:49.377 INFO: killing vhost app 00:30:49.377 WARN: no vhost pid file found 00:30:49.377 INFO: EXIT DONE 00:30:52.670 Waiting for block devices as requested 00:30:52.670 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:30:52.670 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:30:52.670 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:30:52.670 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:30:52.931 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:30:52.931 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:30:52.931 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:30:52.931 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:30:53.191 0000:65:00.0 (144d a80a): vfio-pci -> nvme 00:30:53.452 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:30:53.452 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:30:53.452 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:30:53.452 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:30:53.713 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:30:53.713 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:30:53.713 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:30:54.020 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:30:58.272 Cleaning 00:30:58.272 Removing: /var/run/dpdk/spdk0/config 00:30:58.272 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:30:58.272 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:30:58.272 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:30:58.272 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:30:58.272 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:30:58.533 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:30:58.533 
Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:30:58.533 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:30:58.533 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:30:58.533 Removing: /var/run/dpdk/spdk0/hugepage_info 00:30:58.533 Removing: /dev/shm/nvmf_trace.0 00:30:58.533 Removing: /dev/shm/spdk_tgt_trace.pid1434875 00:30:58.533 Removing: /var/run/dpdk/spdk0 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1433418 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1434875 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1435612 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1436734 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1436993 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1438119 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1438245 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1438576 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1441364 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1442814 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1443188 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1443577 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1443973 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1444361 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1444706 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1445026 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1445282 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1446385 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1450011 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1450356 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1450623 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1450825 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1451058 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1451200 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1451545 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1451893 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1452120 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1452339 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1452622 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1452963 00:30:58.533 Removing: 
/var/run/dpdk/spdk_pid1453316 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1453660 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1453919 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1454130 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1454386 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1454735 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1455079 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1455420 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1455751 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1455967 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1456185 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1456505 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1456846 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1457193 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1457537 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1457887 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1458244 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1458585 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1458940 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1459297 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1459652 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1459999 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1460333 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1460709 00:30:58.533 Removing: /var/run/dpdk/spdk_pid1461239 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1461600 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1461701 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1466786 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1469697 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1472101 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1473436 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1475032 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1475443 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1475473 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1475494 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1481197 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1481885 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1483217 
00:30:58.794 Removing: /var/run/dpdk/spdk_pid1483564 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1490384 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1492470 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1493622 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1498810 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1500952 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1502044 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1507327 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1510738 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1511834 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1523698 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1526417 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1527715 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1539911 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1542650 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1543934 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1556673 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1560765 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1562177 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1575845 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1578822 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1580232 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1594123 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1597407 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1598811 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1612474 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1617228 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1618631 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1620016 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1623993 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1630958 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1634491 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1640431 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1645408 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1652791 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1656671 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1665505 00:30:58.794 Removing: 
/var/run/dpdk/spdk_pid1668416 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1676337 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1679259 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1687001 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1689947 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1695362 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1695820 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1696367 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1696788 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1697451 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1698410 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1699321 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1699736 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1701532 00:30:58.794 Removing: /var/run/dpdk/spdk_pid1703968 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1705644 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1707046 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1708900 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1710724 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1712607 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1713972 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1714792 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1715228 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1718026 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1720611 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1723330 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1724960 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1726571 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1727312 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1727335 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1727471 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1727764 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1728048 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1729399 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1731758 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1733956 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1735153 00:30:59.055 Removing: /var/run/dpdk/spdk_pid1736234 
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1736577
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1736606
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1736640
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1738116
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1738791
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1739779
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1742465
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1745274
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1747892
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1749406
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1751059
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1751741
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1751889
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1756869
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1757223
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1757259
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1757563
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1757646
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1758002
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1759070
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1761087
00:30:59.055 Removing: /var/run/dpdk/spdk_pid1762425
00:30:59.055 Clean
00:30:59.317 13:59:13 -- common/autotest_common.sh@1450 -- # return 0
00:30:59.317 13:59:13 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:30:59.317 13:59:13 -- common/autotest_common.sh@729 -- # xtrace_disable
00:30:59.317 13:59:13 -- common/autotest_common.sh@10 -- # set +x
00:30:59.317 13:59:13 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:30:59.317 13:59:13 -- common/autotest_common.sh@729 -- # xtrace_disable
00:30:59.317 13:59:13 -- common/autotest_common.sh@10 -- # set +x
00:30:59.317 13:59:13 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:30:59.317 13:59:13 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:30:59.317 13:59:13 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:30:59.317 13:59:13 -- spdk/autotest.sh@391 -- # hash lcov
00:30:59.317 13:59:13 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:30:59.317 13:59:13 -- spdk/autotest.sh@393 -- # hostname
00:30:59.317 13:59:13 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-cyp-08 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:30:59.578 geninfo: WARNING: invalid characters removed from testname!
00:31:26.162 13:59:37 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:26.162 13:59:40 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:28.705 13:59:42 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:30.616 13:59:44 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:33.159 13:59:47 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:35.071 13:59:49 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:31:36.985 13:59:51 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:31:36.985 13:59:51 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:31:36.985 13:59:51 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:31:36.985 13:59:51 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:31:36.985 13:59:51 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:31:36.985 13:59:51 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:31:36.985 13:59:51 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:31:36.985 13:59:51 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:31:36.985 13:59:51 -- paths/export.sh@5 -- $ export PATH
00:31:36.985 13:59:51 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:31:36.985 13:59:51 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:31:36.985 13:59:51 -- common/autobuild_common.sh@437 -- $ date +%s
00:31:36.985 13:59:51 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718020791.XXXXXX
00:31:36.985 13:59:51 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718020791.idwa5t
00:31:36.985 13:59:51 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]]
00:31:36.985 13:59:51 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']'
00:31:36.985 13:59:51 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:31:36.985 13:59:51 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:31:37.246 13:59:51 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:31:37.246 13:59:51 -- common/autobuild_common.sh@453 -- $ get_config_params
00:31:37.246 13:59:51 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:31:37.246 13:59:51 -- common/autotest_common.sh@10 -- $ set +x
00:31:37.246 13:59:51 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:31:37.246 13:59:51 -- common/autobuild_common.sh@455 -- $ start_monitor_resources
00:31:37.246 13:59:51 -- pm/common@17 -- $ local monitor
00:31:37.246 13:59:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:37.246 13:59:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:37.246 13:59:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:37.246 13:59:51 -- pm/common@21 -- $ date +%s
00:31:37.246 13:59:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:37.246 13:59:51 -- pm/common@25 -- $ sleep 1
00:31:37.246 13:59:51 -- pm/common@21 -- $ date +%s
00:31:37.246 13:59:51 -- pm/common@21 -- $ date +%s
00:31:37.246 13:59:51 -- pm/common@21 -- $ date +%s
00:31:37.246 13:59:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718020791
00:31:37.246 13:59:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718020791
00:31:37.246 13:59:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718020791
00:31:37.246 13:59:51 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718020791
00:31:37.246 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718020791_collect-vmstat.pm.log
00:31:37.246 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718020791_collect-cpu-load.pm.log
00:31:37.246 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718020791_collect-cpu-temp.pm.log
00:31:37.246 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718020791_collect-bmc-pm.bmc.pm.log
00:31:38.190 13:59:52 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT
00:31:38.190 13:59:52 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j144
00:31:38.190 13:59:52 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:31:38.190 13:59:52 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:31:38.190 13:59:52 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:31:38.190 13:59:52 -- spdk/autopackage.sh@19 -- $ timing_finish
00:31:38.190 13:59:52 -- common/autotest_common.sh@735 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:31:38.190 13:59:52 -- common/autotest_common.sh@736 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:31:38.190 13:59:52 -- common/autotest_common.sh@738 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:31:38.190 13:59:52 -- spdk/autopackage.sh@20 -- $ exit 0
00:31:38.190 13:59:52 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:31:38.190 13:59:52 -- pm/common@29 -- $ signal_monitor_resources TERM
00:31:38.190 13:59:52 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:31:38.190 13:59:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:38.190 13:59:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:31:38.190 13:59:52 -- pm/common@44 -- $ pid=1776061
00:31:38.190 13:59:52 -- pm/common@50 -- $ kill -TERM 1776061
00:31:38.190 13:59:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:38.191 13:59:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:31:38.191 13:59:52 -- pm/common@44 -- $ pid=1776062
00:31:38.191 13:59:52 -- pm/common@50 -- $ kill -TERM 1776062
00:31:38.191 13:59:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:38.191 13:59:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:31:38.191 13:59:52 -- pm/common@44 -- $ pid=1776064
00:31:38.191 13:59:52 -- pm/common@50 -- $ kill -TERM 1776064
00:31:38.191 13:59:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:31:38.191 13:59:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:31:38.191 13:59:52 -- pm/common@44 -- $ pid=1776087
00:31:38.191 13:59:52 -- pm/common@50 -- $ sudo -E kill -TERM 1776087
+ [[ -n 1298994 ]]
00:31:38.191 + sudo kill 1298994
00:31:38.201 [Pipeline] }
00:31:38.219 [Pipeline] // stage
00:31:38.224 [Pipeline] }
00:31:38.240 [Pipeline] // timeout
00:31:38.245 [Pipeline] }
00:31:38.262 [Pipeline] // catchError
00:31:38.266 [Pipeline] }
00:31:38.284 [Pipeline] // wrap
00:31:38.291 [Pipeline] }
00:31:38.307 [Pipeline] // catchError
00:31:38.316 [Pipeline] stage
00:31:38.319 [Pipeline] { (Epilogue)
00:31:38.333 [Pipeline] catchError
00:31:38.335 [Pipeline] {
00:31:38.350 [Pipeline] echo
00:31:38.352 Cleanup processes
00:31:38.358 [Pipeline] sh
00:31:38.649 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:31:38.649 1776169 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:31:38.649 1776603 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:31:38.665 [Pipeline] sh
00:31:38.956 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:31:38.956 ++ grep -v 'sudo pgrep'
00:31:38.956 ++ awk '{print $1}'
00:31:38.956 + sudo kill -9 1776169
00:31:38.971 [Pipeline] sh
00:31:39.260 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:31:54.181 [Pipeline] sh
00:31:54.469 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:31:54.469 Artifacts sizes are good
00:31:54.486 [Pipeline] archiveArtifacts
00:31:54.494 Archiving artifacts
00:31:54.687 [Pipeline] sh
00:31:55.026 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:31:55.043 [Pipeline] cleanWs
00:31:55.053 [WS-CLEANUP] Deleting project workspace...
00:31:55.053 [WS-CLEANUP] Deferred wipeout is used...
00:31:55.060 [WS-CLEANUP] done
00:31:55.063 [Pipeline] }
00:31:55.085 [Pipeline] // catchError
00:31:55.100 [Pipeline] sh
00:31:55.392 + logger -p user.info -t JENKINS-CI
00:31:55.403 [Pipeline] }
00:31:55.421 [Pipeline] // stage
00:31:55.426 [Pipeline] }
00:31:55.444 [Pipeline] // node
00:31:55.450 [Pipeline] End of Pipeline
00:31:55.495 Finished: SUCCESS
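The `pm/common` traces above (`@19`/`@21` starting monitors, then `@42`-`@50` checking each `collect-*.pid` file and sending `kill -TERM`) follow a standard pid-file pattern: each monitor records its PID under the power output directory, and `stop_monitor_resources` signals whatever PIDs the files name. A minimal self-contained sketch of that pattern is below; the directory and monitor names here are illustrative stand-ins, not the actual pm/common variables or scripts.

```shell
#!/bin/sh
# Sketch of the pid-file start/stop pattern seen in pm/common above
# (assumed names: power_dir, the collect-* monitor list).
set -eu

power_dir=$(mktemp -d)

# Start side: each "monitor" (a placeholder sleep here) records its PID,
# mirroring collect-cpu-load.pid / collect-vmstat.pid in the log.
for name in collect-cpu-load collect-vmstat; do
    sleep 60 &
    echo $! > "$power_dir/$name.pid"
done

# Stop side: for each known monitor, TERM the recorded PID if its
# pid file exists (cf. pm/common@42-@50), then drop the pid file.
for name in collect-cpu-load collect-vmstat; do
    pidfile="$power_dir/$name.pid"
    if [ -e "$pidfile" ]; then
        pid=$(cat "$pidfile")
        kill -TERM "$pid" 2>/dev/null || true
        rm -f "$pidfile"
    fi
done

# Verify cleanup: no pid files should remain.
remaining=$(ls "$power_dir" | wc -l | tr -d ' ')
echo "remaining pid files: $remaining"
rmdir "$power_dir"
```

In the real scripts the stop side is installed with `trap stop_monitor_resources EXIT` (visible at `autobuild_common.sh@456`), so the monitors are reaped even if the build aborts.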